32nm Intel Westmere to hit production in late 2009

Sometimes, just sometimes, unsourced rumors from small Asian tech sites turn out to be spot on. During a briefing earlier today, Intel confirmed plans to start producing 32nm dual-core Westmere processors in the fourth quarter of this year—and the company says it may never introduce 45nm dual-core Nehalem derivatives.

To understand Intel’s latest desktop and mobile roadmap, you first need to get to grips with the latest batch of code names. Think of Westmere in the same way as Penryn: a new core based on an older architecture that will form the basis for a line of products. In this case, Westmere is more than a mere die-shrink. The design includes only two 32nm cores with 4MB of cache, and its memory controller resides on a separate, 45nm die together with an integrated graphics chip. Like so:

Penryn on the left, Westmere on the right. Source: Intel.

Westmere CPUs will have some of the same perks as their Nehalem brethren. Those include Hyper-Threading, which will allow each core to juggle two threads, and Turbo Boost, which will let the CPU shut off one core and clock up the other to improve single-threaded performance. Intel has also tossed in seven new instructions for accelerating encryption and decryption algorithms, which should come in handy for tasks like full-disk encryption in software.
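
Intel's new instructions are better known today as AES-NI, and software that wants to use them would first probe for support at runtime: CPUID leaf 1 reports the AES instructions in bit 25 of ECX. Here's a minimal C sketch of that check (an illustration, not code from Intel's briefing), using GCC's <cpuid.h> helper:

```c
/* Minimal sketch: detect Westmere's AES instructions (AES-NI) at runtime
 * before choosing a hardware-accelerated encryption path. CPUID leaf 1
 * reports the feature in bit 25 of ECX. Build with GCC or Clang on x86. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* __get_cpuid returns 0 if the requested leaf isn't available. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 unavailable");
        return 1;
    }

    puts(ecx & (1u << 25) ? "AES-NI supported: use the hardware path"
                          : "No AES-NI: fall back to a software implementation");
    return 0;
}
```

A full-disk encryption package would do a check like this once, then dispatch to the accelerated code path only when the bit is set and fall back to a plain software implementation otherwise.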

As for the graphics core, Intel said it will be a tweaked, 45nm version of its existing 65nm integrated graphics processor. The smaller process will allow for higher clock speeds, and Intel claims proximity to the memory controller will reduce latency and improve memory bandwidth. The firm named reduced latency as a primary reason for moving the IGP into the CPU package, but that explanation sounds a tad dubious. (Latency seems like the least of Intel’s problems on the integrated graphics front right now, and “discrete” IGPs from AMD and Nvidia perform considerably better.) In all likelihood, Intel wants to elbow out other IGP makers and perhaps give itself the option of upgrading IGP cores without needing to change CPU cores and chipsets.

Intel boasted that its very first Westmere silicon was capable of running a PC and booting up, saying this indicates “very robust process health” and a “very robust product.” In fact, Intel showed Westmere-powered desktop and notebook systems running an operating system and applications after the briefing. Westmere’s precocity is why Intel opted to “de-prioritize” 45nm dual-core Havendale and Auburndale processors, which means the chips might never launch at all.

So, that’s Westmere. Intel will use it to build two products at first: Clarkdale on the desktop and Arrandale for notebooks. Intel suggested that both CPUs will have clock speeds and power envelopes similar to those of existing mainstream desktop offerings, and they should retain the Core name, as well. Both Clarkdale and Arrandale will be dual-core products, and they should hit production in Q4 2009—presumably in time for a launch early next year, although Intel says it still needs to work out a launch schedule with partners.

A “little later,” Intel will follow up with Gulftown, a high-end desktop processor with six cores and twelve threads. Since it’ll be Westmere-based, this product may have three dual-core processor dice in the same package. Gulftown will team up with the same X58 chipset as today’s Core i7 CPUs.

In the meantime, Intel has another pair of 45nm processors lined up for the second half of this year: Lynnfield for desktops and Clarksfield for notebooks. Both CPUs will have four cores and eight threads just like the Core i7, but they’ll be part of a different platform with a new socket and new 5-series chipsets. Incidentally, Intel said 32nm Clarkdale and Arrandale CPUs will work with that same platform. The new roadmap says nothing of quad-core Westmere offerings, hinting instead that Clarkdale, Arrandale, Lynnfield, and Clarksfield will simply co-exist at different price points until Sandy Bridge—the upcoming 32nm architecture refresh—comes along.

Comments closed
    • ryko
    • 12 years ago

    i think i will wait for “sandy bridge” seeing as i just picked up a q9300 a few weeks ago…although core i5 will be tempting. not really interested in westmere for some reason.

    • SpotTheCat
    • 12 years ago

    Man, you guys are harsh. I see this as an amazing thing for laptops. I’ve found that I use the “stamina” setting (integrated Intel) more than the “speed” (nVidia 8400) on my switchable graphics laptop anyways. It runs longer, the drivers work better when switching between laptop and presentation, and it’s not as hot on my lap.

    I’d be double excited if this new IGP could hardware decode Blu-ray discs without fault.

    • maroon1
    • 12 years ago

    So Gulftown, a high-end desktop processor with six cores and twelve threads, will support X58 motherboards!!!

    That’s good news for i7 users

    • Usacomp2k3
    • 12 years ago

    What I think is interesting is that they’re pulling the memory controller back out of the CPU, unlike with Nehalem (i.e., it’s on a separate piece of silicon even though it’s technically in the same CPU package).

      • UberGerbil
      • 12 years ago

      Yeah, it is an interesting design. Of course without local video memory the GPU has more significant memory bandwidth/latency needs than the CPU does. And from the CPU’s point of view, this is just a NUMA situation where it doesn’t have any local memory of its own (like some of those 2P Opteron boards that only had memory hanging off one socket). Latency will be higher, but probably not enough to matter on most real-world tests. And having it in the same MCM is still better than having your northbridge sitting off in another socket altogether, as with Core 2.

        • Usacomp2k3
        • 12 years ago

        That’s a very good point. I had forgotten that the GPU would need access to memory. That’s a pretty significant oversight on my part.

    • ish718
    • 12 years ago

    I wonder how that dual core+ IGP “Westmere” with HTT would perform against a core2quad in multithreaded apps
    4 physical cores VS 2 physical cores + 2 threads

    • piesquared
    • 12 years ago
      • UberGerbil
      • 12 years ago

      Apparently you missed the entire “Phenom II” section in today’s shortbread?
      https://techreport.com/discussions.x/16394
      I expect there's a review or two (processors, chipsets, mobos, oh my!) in the works. But TR traditionally has worried more about having the review be right than having it be first.

        • ssidbroadcast
        • 12 years ago

        Yeah by my guesses Scott must be burning the midnight oil right about now.

        • piesquared
        • 12 years ago
          • UberGerbil
          • 12 years ago

          The smell of troll is strong in this one.

          Yeah, Techreport is sitting around raking in the cash from Intel, and so am I. And you’re sitting in your parent’s basement shaking your impotent little fists about it.

      • xlmohi
      • 12 years ago

      A little donut for the BIG Giant.

      I guess PIESQUARED was asking about news on the front page.

      That got me thinking the same way he does!!! (((biased)))

      • green
      • 12 years ago

      listen to the podcast
      can’t remember now but i thought i heard scott saying there was a cpu article coming out this week

    • Voldenuit
    • 12 years ago

    More likely havendale was scrapped because intel is having trouble making a current-gen GPU, so they pushed it back to the 32nm node timeline.

    Looking at intel’s mess of IGPs in the past 10 years made me breathe a sigh of relief (G35 user here).

      • ssidbroadcast
      • 12 years ago

      Er, I think the target-spec for something like this would be close to what a GMA950 already does, ie normal 2D DirectDraw stuff w/ GeForce2-era 3D abilities.

    • glacius555
    • 12 years ago

    *scratches his head*
    Errr.. So should I upgrade to Core i7 now, in April?? Or will this socket die out, like LGA775??

      • ish718
      • 12 years ago

      No. LGA1366 will not die out. Well, not that soon.
      The six-core, 12-thread “Gulftown” will also use LGA1366.

        • UberGerbil
        • 12 years ago

        Not that it really matters for this question, but Intel is also using the same socket for Gainestown, the 2P server part (which will use regular DDR3, not the FB-DIMMs that will be required for the MP line), though it’s unclear if they’ll be totally compatible.

      • UberGerbil
      • 12 years ago

      There’s going to be more than one viable socket, just like AMD in the 754/939/940 days. If you want to get on the expensive, enthusiast/workstation track, buy an i7 now. If you want to get on the mid-range/value track, buy an i5 when it comes out. It just seems confusing right now because the full Intel lineup isn’t out yet — just their server CPU pretending to be an “enthusiast” chip.

      Of course lots of people never get around to swapping CPUs, and just move on to a new motherboard and new CPU later (if you buy i7 now to “futureproof” yourself, then by the time you’re thinking about upgrading your CPU will you be wishing your mobo has USB 3.0, next generation SATA, a nonvolatile storage controller, etc?)

    • kamikaziechameleon
    • 12 years ago

    honestly intel making IGPs is just bad news, since they sell products that are far less capable than anything their competitors put out. intel is famous for three things: amazing CPUs, making money, and horrible IGPs. them making another IGP is like telling me bush is running for a third term.

      • MadManOriginal
      • 12 years ago

      One difference is that what Intel is doing is not unconstitutional :p not that that stopped GW from doing some things but a third term would have been too far.

    • indeego
    • 12 years ago

    Intel has been promising decent graphics performance for decades. If it hasn’t happened yet, it ain’t gonna happen with this <.<

      • ClickClick5
      • 12 years ago

      I raise my glass here, sir. If AMD pulls off their (no longer called Fusion) GPU/CPU design, it would be good. Imagine an R700 slabbed onto one of their Bulldozer cores. AND it would still be cheaper than the whoo-hoo Intel GPU.

    • eitje
    • 12 years ago

    I have never understood why the CPU+GPU thing is so attractive.

      • flip-mode
      • 12 years ago

      It is unclear to me as well. It is attractive when it comes to system upgrades. Perhaps it has something to do with stream computing where processing can be shifted from the CPU to the GPU and back again with very low overhead? Dunno. Ask Uber-Gerbil.

        • OneArmedScissor
        • 12 years ago

        Cheap is attractive. They aren’t exactly trying to sell you this because it magically makes Crysis run at 2560×1600 with the graphics maxed out. Think the other way around.

        • ssidbroadcast
        • 12 years ago

        Yeah, ask UberGerbil.

          • ClickClick5
          • 12 years ago

          lol, I’ll fill in till he gets here.

          For the mobile market, say laptops, handhelds, and these new nifty nettops, this is really an attractive idea. For cheap desktops (I call them “Myspace/iTunes” computers), this is also a cheap alternative. Easier to put together.

          • UberGerbil
          • 12 years ago

          Ask Ludi. What he said 😉

          For Intel it’s about delivering more platform at less cost, though it likely will pass only some of those cost savings on to the OEMs (ie higher margins on the budget part of the line).

      • ludi
      • 12 years ago

      1) Instant lock-in. Intel is guaranteed to get the full system sale.

      2) Fewer discrete chip packages makes for smaller, cheaper system boards and easier system cooling, which translates into a cheater, lighter laptop.

        • ssidbroadcast
        • 12 years ago

        “…which translates into a cheater, lighter laptop.”

          • Meadows
          • 12 years ago

          No, I think he means this will be a professional Counter-Strike tool for LAN parties.

          • ludi
          • 12 years ago

          Haha. Sorry, I’ve got some sort of endocrinal imbalance going on right now in conjunction with a lot of lost sleep. Waiting on bloodwork to see what it is. Meanwhile, I tend to kinda crash in the afternoon/early evening for a few hours.

      • Krogoth
      • 12 years ago

      It is for OEM vendors, to help them make their mainstream boxes smaller and simpler.

      A built-in IGP would be quite handy when the primary GPU is having problems and/or going through an RMA.

      This was a large part of AMD’s Fusion program, but it seems that Intel has beaten them to the punch.

      • shank15217
      • 12 years ago

      AMD’s vision of Fusion was more than CPU+GPU; it was a CPU plus a specialized processor for accelerating graphics, physics, or other types of scientific computation.

        • OneArmedScissor
        • 12 years ago

        Yes, the “accelerated processing unit,” or APU, as they called it. I believe the intention for the first one is to have it handle HD video for laptops.

        • UberGerbil
        • 12 years ago

        Which makes it more like Larrabee

      • Thresher
      • 12 years ago

      I agree, this seems like it would be a nightmare to support.

    • My Johnson
    • 12 years ago

    I’ll only buy one (say, a laptop) if it comes with a discrete Nvidia or AMD GPU.

      • green
      • 12 years ago

      i’d give it a crack anyway and disable the igp, as apparently an i7 core can beat the igp in dx10/11 processing (or something like that)
      but either way, if you don’t run enough to use the second core at all, you may as well put it to work doing something useful

    • shiznit
    • 12 years ago

    this will be bad for nvidia. I wonder what they can do when Intel graphics are already on die and you can’t run 2 different video drivers at once…

    edit:
    nevermind, the uncropped image from http://download.intel.com/pressroom/kits/32nm/westmere/32nm_WSM_Press.pdf clearly says "switchable graphics support" under the picture.

      • accord1999
      • 12 years ago

      Some notebooks already have Switchable Graphics, which allows switching to the IGP to save power, or switching to a discrete video card for more performance with just a couple of mouse clicks. So Nvidia always has that option in the future.

        • shiznit
        • 12 years ago

        true but are the igp and the discrete gpu from 2 different companies? afaik you can only have 1 graphics driver installed.

          • accord1999
          • 12 years ago

          The nice thing is Switchable Graphics doesn’t require the same manufacturer. I have a Thinkpad W500, which can switch between an Intel 4500 IGP and an ATI FireGL 5700.

            • shiznit
            • 12 years ago

            Ok I didn’t know that. That explains a lot, thanks.

      • Captain Ned
      • 12 years ago

      Certainly explains the rumors of nvidia working on an x86 CPU, license be damned.

      • indeego
      • 12 years ago

      You can already run onboard Intel, add-in PCI Matrox, and add-in PCIe ATI or Nvidia with few issues <.<

      • OneArmedScissor
      • 12 years ago

      Nah, by that time, Nvidia’s integrated graphics will make it pointless for a very large number of laptops to have discrete graphics, so they’ll undoubtedly be commonly paired with the quad-core parts. The days of integrated graphics not being powerful enough for modern games are about to come to an end. Most laptops don’t run high enough resolutions to need a Radeon 4800 or GTX 200, and that’s not going to change, but what will change is that it won’t be long before something like a 9600GT comes built straight into your motherboard.

      The CPU+GPU Westmeres will be like the Pentium Dual-Cores. They will be for the most dumbed down laptops of all, so as not to steal the thunder of higher end parts, and GOOD integrated graphics will be able to move up from that level.

      I think it’s also possible that discrete graphics could become more common, and the midrange on laptops could potentially disappear. You’d have the very low end that doesn’t do much of anything, and then it would just skip straight to a borderline desktop replacement.

      Considering that there will be a standardized line of mobile quad-cores for anything beyond the bottom of the barrel, and that GPUs should be much cheaper and power efficient by then, a “midrange” laptop could very well be the equivalent of a desktop with a Core 2 Quad and a 9800GTX.

        • shiznit
        • 12 years ago

        makes sense, but it worries me a bit. Midrange laptops going away will hurt pc gaming. CPU power is plenty even in entry level laptops for most casual to semi-hardcore gamers. 2 cores is enough for anything except GTA4. What is lacking the most is decent GPU power, and this trend looks like it will make it even worse.

        of course Intel isn’t doing this to corner the laptop gaming market, but for the average WoW or CS player a dual-core Westmere + nvidia igp would be a pretty decent laptop.

          • OneArmedScissor
          • 12 years ago

          Why does that worry you? Desktops aren’t going anywhere. There are two reasons why laptops will never be able to compare:

          1) Screen size. LCDs just keep getting cheaper, larger, and higher resolution. But regardless, we aren’t going to be toting around 24″ laptops, and most people aren’t going to make it a point to spend ALL their time playing games on a smaller screen than they can get for dirt cheap. Could you plug another monitor into a laptop? Yes. But it will probably be higher resolution if you get a large one, and the GPU isn’t going to work for that. You will still need a desktop with modern parts to handle the highest resolutions.

          2) Have fun trying to upgrade a laptop. :p

          Laptops will still fill the “niche” they always have: portability. The difference is just that they will finally be able to fill it in ALL ways. They aren’t going to overtake computers completely.

            • Kurotetsu
            • 12 years ago

            http://en.wikipedia.org/wiki/ATI_XGP

            On the go, you could use the onboard graphics of your laptop. At home, and with your larger-res monitor, you would plug your laptop into an XGP box which feeds your high-res LCD (or, hopefully, SED) screen. XGP was limited to only feeding back to the laptop's screen, but a recent announcement said it now supported feeding an external monitor.

      • MadManOriginal
      • 12 years ago

      Win7 will remove the single-video-driver model that Vista currently uses.

        • Usacomp2k3
        • 12 years ago

        Cool. Do you have a link to that?

          • UberGerbil
          • 12 years ago

          I haven’t seen it actually documented on msdn or whdc etc (and I’ve yet to hear of anyone actually trying it) but you can see it mentioned from slide #35 onward of this presentation from last November’s WinHec
          http://download.microsoft.com/download/5/E/6/5E66B27B-988B-4F50-AF3A-C2FF1E62180F/GRA-T518_WH08.pptx
          (There's some interesting stuff in there about optimizations within the WDM to reduce memory usage in Win7 as well)

            • Usacomp2k3
            • 12 years ago

            Cool. Thanks. I’m glad to see that.

    • elty
    • 12 years ago

    Penryn = 45nm Core 2 Duo
    Nehalem = Core i7, quad
    Westmere = “Core i7+”, dual with IGP module on the chip
    Clarkdale = Desktop version of Westmere
    Arrandale = Laptop version of Westmere
    Gulftown = a 6 core version of Westmere
    Sandy Bridge = something completely new
    Havendale / Auburndale = dual core version of i7
    Lynnfield / Clarksfield = Core i7 that use yet another socket

    Am I correct?

      • OneArmedScissor
      • 12 years ago

      Yup, except Havendale and Auburndale have been canceled and they will just skip straight to the 32nm versions. They weren’t quite dual-core Core i7s. They would have been more like dual-core+GPU Core i5s, which Lynnfield and Clarksfield are. They have dual-channel memory controllers and Direct Media Interfaces instead of QuickPath Interconnects.

    • Hattig
    • 12 years ago

    I think Intel have the right idea here. Again. AMD is just blathering around selling tweaked massive server CPUs for $150 and putting back everything that sounds good. And I’m saying that as an AMD fan. They have the jigsaw pieces and I believe the finished puzzle would be great because of their graphics skills, but …

    • willyolio
    • 12 years ago

    actually, i find rumours that begin on small, asian tech sites seem to be more accurate than western sites. like the RV770 rumours.

    could be because so much manufacturing is done over there…

      • OneArmedScissor
      • 12 years ago

      A good deal of “rumors” I read from Asian sites are based on fact. Some I read show lots of actual pictures and test results of things that people managed to get their hands on, or literally found in a store before it comes out here. People working in the factories could easily get their hands on all sorts of things long before they see the light of day in the US. I can’t really think of any instance where I’ve seen a good one just plain make stuff up. They may find out MORE information as they go, but they keep it updated.

    • ssidbroadcast
    • 12 years ago

      • Cyril
      • 12 years ago

      Pretty much, yes.

      • Meadows
      • 12 years ago

      Exactly. It’s nothing to get confused about. Intel says they’re not intent on following AMD’s footsteps.
