Report: Haswell desktop TDPs to reach 84W

Mobile versions of Intel’s upcoming Haswell microprocessor are expected to have TDPs as low as 10W, a substantial reduction from the 17W thermal envelope of current low-power Ivy Bridge chips. Desktop versions of Ivy top out at 77W, and according to VR-Zone’s Chinese-language site, their replacements could carry TDP ratings as high as 84W. That information comes from a table of purported specifications for Intel’s next-generation desktop processors.

According to the table, Haswell desktop CPUs will come in two flavors: standard power and low power. Only the standard-power chips will be rated for 84W, while the low-power group will have thermal envelopes in the 35-65W range. Enthusiasts will likely be most interested in the Core i7-4770K and i5-4670K, which will supposedly have fully unlocked multipliers and clock speeds identical to the existing i7-3770K and i5-3570K. The fastest low-power Haswell variant, the i7-4770S, will reportedly match the flagship’s 3.9GHz Turbo peak but have a base clock speed of only 3.1GHz, a 400MHz reduction from the i7-4770K.

Interestingly, VR-Zone’s information reveals only one dual-core model, the Core i5-4570T. The table doesn’t include any Core i3 variants, though. I suspect the initial batch of Haswell duallies will be reserved largely for mobile use. We saw something similar with Ivy Bridge, whose Core i3 desktop models didn’t arrive until months after the first i5 and i7 CPUs.

Overall, the leaked Haswell specs don’t show many differences between the incoming chips and existing Ivy Bridge CPUs. I’m curious to see how the real-world performance and power consumption of the two generations will compare. So is everyone else, I’m sure, but we’ll probably have to wait until the spring for a definitive answer on that front.

Comments closed
    • joselillo_25
    • 7 years ago

    This means that AMD Trinity is a better chip than the reviews have said. You simply cannot scale up GPU performance without power, so in APU terms it’s a good buy. Too bad AMD has financial and market-share problems these days.

    • rrr
    • 7 years ago

    Higher TDP than Ivy Bridge, not a good sign at all.

      • Arclight
      • 7 years ago

      Does that mean that Nehalem was a complete disaster compared to Penryn?

        • rrr
        • 7 years ago

        You mean “not a good sign” and “complete disaster” are interchangeable?

        Well, then your gf told me your sexual performance is not a good sign…

          • Arclight
          • 7 years ago

          Huh?

          C2Q 9xxx, 95W -> i7 9xx, 130W: a 35W difference for two CPUs launched at around the same price point, compared to Ivy at 77W and Haswell at 84W, a difference of 7W.

          So no, “not a good sign” and “complete disaster” are not interchangeable, since the difference between Penryn and Nehalem, in terms of TDP, was clearly bigger than Ivy vs. Haswell.

            • Sahrin
            • 7 years ago

            Apparently “Complete disaster” means 28.

            • willmore
            • 7 years ago

            C2Q QX6800: 65nm and 130W at 3.0GHz
            C2Q Q9850: 45nm and 95W at 3.0GHz (and 50% larger cache)
            i7 9xx: 45nm and 130W

            Looks pretty normal for its place in the tick/tock cycle.

            • Arclight
            • 7 years ago

            You people just don’t get it. I was pointing out to rrr that he’s wrong to consider 7W more TDP for Haswell, compared to Ivy, a bad sign, when we have much bigger precedents in the past that proved the added TDP was worth it from a performance point of view.

            • willmore
            • 7 years ago

            And I was showing that it’s normal for the new arch on the same old process to use more power and that it’s generally worth it.

            • Arclight
            • 7 years ago

            And I never questioned that; that’s exactly what I meant as well…

            • willmore
            • 7 years ago

            We are in violent agreement, then!

      • CppThis
      • 7 years ago

      Ivy Bridge had way too tight an envelope, as anyone who’s tried to overclock one can tell you. That being said, Intel needs to keep their eye on the ball and remember that the chip of the future is a low-power quad at 3.x GHz. Hopefully the efficient ones don’t suck, because it’s going to be a while before AMD can compete here.

        • NeelyCam
        • 7 years ago

        I still think something went wrong with Ivy Bridge (considering the high idle VOLTAGE levels SPCR was showing). I don’t know what, but SB operating from a lower voltage than IB is just plain wrong.

        I hope they figured out what went wrong in time for Haswell, because that’s what I’m waiting for…

    • Arclight
    • 7 years ago

    84W? Ha, my low-end aftermarket cooler laughs at it. BTW, any word on Haswell motherboards? Will the socket be different enough that they’ll have to change the layout of the cooler mounting holes? That would be troublesome…

    • Jigar
    • 7 years ago

    If this performs on par with my expectations, I might finally replace my strong Q6600.

      • Final-Reality
      • 7 years ago

      Amen. I finally upgraded my GPU from an 8800 GTS 512 to a 660 Ti, kept the rest of the system from that era (Q6600 G0 stepping), and I still see no reason to upgrade. I’m guessing if I plotted my framerate like TR does for their reviews, I’d see quite a few spikes in games like FC3 and BF3, but I can deal with that for now 🙂

        • EJ257
        • 7 years ago

        I’m in the same boat. My 8800 GTX died last year, so I replaced it with a 6970. Everything else (except for the Samsung 830) is still vintage 2007. I think I waited long enough.

    • ronch
    • 7 years ago

    Intel must have cranked up the GPU specs for Haswell for it to reach 84W like that, compared to Ivy’s 77W rating. Gamers will probably just turn the iGPU off. TR should compare Haswell to Ivy, with both using discrete graphics instead of their iGPUs, to see if the CPU cores themselves draw more power or if Haswell’s 84W rating is simply because of the beefed-up iGPU. Hopefully, Steamroller will also be a lot more energy efficient so it can be more compelling against Intel CPUs than current AMD CPUs are.

    • Derfer
    • 7 years ago

    84>77
    I guess TDP is up because of the integrated VRM?

      • NeelyCam
      • 7 years ago

      Ah, that would make sense

      • willmore
      • 7 years ago

      I don’t see why. The only part of the VRM on die is the control portion. The MOSFET drivers, MOSFETs, caps, and inductors are still next to the socket. Look at video cards: the VRM controller is usually a little TSSOP chip that uses very little power.

        • Derfer
        • 7 years ago

        Was it always explained to be just the control chip? That wasn’t really the impression given early on. I think most were counting on that to help lower board prices and increase reliability on low-end boards.

          • willmore
          • 7 years ago

          Well, on-chip inductors aren’t able to handle the power levels necessary. The process used to make logic isn’t the same as the one used to make power MOSFETs, not to mention that it would be a horrible waste of a 22nm process to make such mundane parts.

          From the slides I’ve seen, simplifying the MB wasn’t their primary concern. Being able to more quickly change the voltage provided to the chip was. Right now, to change frequency up, the CPU has to drive the new VID value on the VID pins and then wait for the VRM to say “okay, we’re good” before it can change the clock speed. Pulling the VRM controller on chip takes away that signaling overhead and delay. The benefit should be a processor which can adapt to load conditions much faster, allowing the chip to be more aggressive in dropping to lower power states, as there is vastly less latency to return to higher clocks.
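
          To put rough numbers on that, here’s a toy model of the handshake overhead. It's a sketch only; both latency figures are illustrative assumptions, not Intel’s numbers:

          # Toy model of DVFS transition overhead. The two latency
          # figures below are assumed purely for illustration.
          OFF_CHIP_US = 50.0  # assumed: drive VID pins, wait for VRM "power good"
          ON_DIE_US = 1.0     # assumed: on-die request, no external signaling

          def handshake_overhead(transitions_per_sec, handshake_us):
              """Fraction of each second spent waiting on the regulator."""
              return transitions_per_sec * handshake_us * 1e-6

          # e.g., a bursty load forcing 1,000 P-state transitions per second
          for name, lat in (("off-chip VRM", OFF_CHIP_US), ("on-die VRM", ON_DIE_US)):
              print(f"{name}: {handshake_overhead(1000, lat):.2%} of each second waiting")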

            • NeelyCam
            • 7 years ago

            [quote<]From the slides I've seen[/quote<]

            Link?

            You don’t have to have inductors on-chip; I’ve seen LC oscillator implementations where high-Q inductors are off-chip, with everything else on-chip.

            [quote<]The process used to make logic isn't the same as used to make power MOSFETs[/quote<]

            CPUs haven’t required “power MOSFET” (e.g., >5V) voltages for years. Or, what do you consider a “power MOSFET”? The output voltages would probably be well below 1.5V in every case… so the control circuitry should be relatively easy to integrate with the rest of the chip.

            • willmore
            • 7 years ago

            Take a look at a motherboard. Look around the CPU socket. You’ll see little black things with three leads on them. Those are power MOSFETs. They take the 12V from the PSU and convert it (with the help of an inductor and a capacitor) to the low voltage that the CPU needs. Google SMPS if you’d like to know more.

            Next to those MOSFETs, inductors, and capacitors, you’ll find a small chip with a couple dozen leads. That’s the VRM control chip. It takes the VID signals from the CPU and then drives the MOSFETs to produce the desired voltage. That chip is the only part of the CPU power regulation circuitry that makes any sense to put onto the CPU.
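
            The step-down itself is simple in the ideal case. A minimal sketch, with both voltages assumed for illustration rather than measured from any real board:

            # Ideal buck (step-down SMPS) relation: Vout = D * Vin, where D is
            # the fraction of each switching period the high-side MOSFET is on.
            # Both voltages below are illustrative assumptions.
            V_IN = 12.0   # PSU 12V rail feeding the VRM phases
            V_CORE = 1.1  # assumed CPU core voltage target

            duty_cycle = V_CORE / V_IN  # ignores switching and conduction losses
            print(f"High-side MOSFET on-time: {duty_cycle:.1%} per period")  # ~9.2%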

            The slides were over at Anandtech or Realworldtechnology about a month back. They should be fairly easy to find.

    • Tristan
    • 7 years ago

    Their tri-gate transistors don’t work? Or maybe that was a marketing feature.
    They must switch to SOI.

      • brute
      • 7 years ago

      marketing feature? i dont think pablo rodriguez of fresno, california is going to know what that even means when he goes shopping for a new pc

    • jwilliams
    • 7 years ago

    It seems odd that every CPU on that list has the same GPU, HD Graphics 4600.

      • mganai
      • 7 years ago

      Edit: Oh, nevermind.

    • Tumbleweed
    • 7 years ago

    “Socket 1150”?

      • NeelyCam
      • 7 years ago

      Seems to be a standard practice now – one socket per tick-tock cycle. I guess something got integrated, so fewer pins are needed

    • chuckula
    • 7 years ago

    If those slides are to be believed, there is at least one thing coming with Haswell: pretty much all of the desktop chips will have the same GPU, with the only differences being (relatively small) changes in the maximum frequency. Everybody gets a flavor of the “HD 4600,” which is probably the middle-of-the-pack “GT2” configuration. It’s not going to beat full-sized Trinity at GPU, but it should be a worthy step up from Ivy Bridge.

    • Bauxite
    • 7 years ago

    i7 4765T is interesting, 35W TDP still gives you a full-featured 4C8T @ 2.0/3.0t

    Lots of potential fun to be had with passive/embedded designs there.

      • Srsly_Bro
      • 7 years ago

      I think you meant interestly.

        • NeelyCam
        • 7 years ago

        This is starting to get a bit stale… not thumbdown worthy yet, but close

        • Bauxite
        • 7 years ago

        Sorry, don’t speak Engrish

          • willmore
          • 7 years ago

          Your English is fine, Srsly_Bro’s the one who needs some help.

    • Goty
    • 7 years ago

    Haswell might just be tempting enough for me to finally upgrade my poor i7 920. Well, that’s if the discounts on IVB chips aren’t too much to pass up.

      • NeelyCam
      • 7 years ago

      When IB came out, SB became a sweet deal

      • MadManOriginal
      • 7 years ago

      If you have a Microcenter within a reasonable distance, there will always be a very nice CPU deal for you. They’ve been fire-selling Sandy Bridges, more so than their usual discount, for a little while now – like $100 2500Ks. It will be fun to see what Ivy Bridge goes for when Haswell comes out.

    • jdaven
    • 7 years ago

    I hope all the extra power goes into the integrated GPU.

      • BestJinjo
      • 7 years ago

      I hope the opposite: that it goes toward the CPU, since CPU speed has stagnated since the Core i7 920-965 days.
      [url<]http://www.guru3d.com/articles_pages/far_cry_3_graphics_performance_review_benchmark,7.html[/url<]

      With Intel’s crappy GPU drivers and current performance, they are roughly 80-100% behind the A10-5800K in gaming performance in less GPU-demanding games where you can actually reasonably use the APU. Who is going to be using an APU in Crysis 1-3, the Metro games, or BF3/BF4, etc.? Chances are the APUs are used in laptops/HTPCs for playing less GPU-demanding games.

      Starcraft 2: [url<]http://images.anandtech.com/graphs/graph6332/50122.png[/url<]
      Minecraft: [url<]http://images.anandtech.com/graphs/graph6332/50163.png[/url<]
      Civ 5: [url<]http://images.anandtech.com/graphs/graph6332/50165.png[/url<]

      Despite all the hype, Haswell’s GPU will be nothing special, since the gap even against the current-gen A10 is enormous. Even if it beats the A10, it won’t be enough, as the A10 is not fast enough for modern games today above 1680x1050. What’s more, next year’s games will be even more GPU-intensive; in titles such as Crysis 3 and Metro: Last Light, even a 2-3x increase in power will be nothing. If we look at where Intel’s HD 4000 sits, it’s barely as fast as an X1800XT/7900GT, while even the A10-5800K (HD 7660D) is slower than an 8800GTS 320MB/3850.
      [url<]http://alienbabeltech.com/abt/viewtopic.php?p=41174[/url<]

      Realistically, to even start thinking about playing modern games at decent settings, you’d need an APU 4.4x faster than the HD 4000 or 2.5x faster than the A10-5800K, which means around HD 7750 level.

      Arguably the most important point is that the desktop 4000 series is not likely to use Haswell’s most powerful version of the 40-unit GPU, as that will be saved for mobile versions. Even now, low-end i3s use the HD 2500 and not the HD 4000.

      For a budget gaming system, a gamer is better off buying an i3 + dedicated GPU, or even an FX-8320 + OC + dedicated GPU. Having a powerful APU in something like an i5-4670K/4770K is backwards thinking, since those CPUs are used with dedicated GPUs for games. Who buys a $325 CPU and uses its APU for games? It makes no sense, as an i3 / FX-6300 + GTX 660 would mop the floor with the APU-based i7-4770K.

      And of course, with the PS4/Xbox 720 launching next year, anyone who can’t afford a real gaming PC with a dedicated GPU would be better off gaming on a console. APUs are still too slow and will be for a while, unless all you play is indie games.
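
      To sanity-check those multipliers (normalized to the HD 4000 = 1x; illustrative numbers only):

      # Sanity check of the multipliers above (illustrative only).
      hd4000 = 1.0              # normalize Intel HD 4000 performance to 1x
      target = 4.4 * hd4000     # claimed level needed, roughly HD 7750 class
      a10_5800k = target / 2.5  # implied A10-5800K level
      print(f"A10-5800K vs HD 4000: {a10_5800k:.2f}x")  # ~1.76x, near the 80-100% gap cited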

        • mganai
        • 7 years ago

        FC3, like most other games, is still GPU-bottlenecked. The 965 there was overclocked to match the default clocks of the more modern CPUs. Then again, no Intel hexa-cores were sampled. (Not sure how much of a difference it would make, though.)

        Most programmers don’t know what to do with multiple threads. There are still very few games that require a quad-core.

      • chuckula
      • 7 years ago

      For once I up-thumb jdaven when everyone else down-thumbs him… oh the irony.
      For the record as a “purported” Intel fanboy: I hope Haswell is using the power for the GPU too since that is the major weakness for Intel’s desktop line compared to the competition.

      • hasseb64
      • 7 years ago

      For sure the GPU draws much more power!
      It should perform 2x up from Ivy Bridge, and that should be some watts, maybe 10W?

      • Final-Reality
      • 7 years ago

      Why? The major bottleneck will be memory bandwidth until high-speed DDR4 is released 4(?) years from now… it’s basically wasted silicon.

    • brute
    • 7 years ago

    with the constant dropping of TDPs, how am i to get hot and bothered over a new cpu?

    nothing got me sweaty like being in a closed room with a Pentium 4 or Phenom X4 pegged at 100%

      • MadManOriginal
      • 7 years ago

      Two CPUs at the same time 🙂

        • sweatshopking
        • 7 years ago

        oh god. pentium D @ 4.2 ghz used to get it steamy in my room. especially when they were really working it.

          • brute
          • 7 years ago

          computerotica

          my new favorite

            • dpaus
            • 7 years ago

            [quote<]computerotica[/quote<] From that prude?!? Not happenin'....

        • superjawes
        • 7 years ago

        “Lawrence, If you had a million dollars, what would you put in your computer?”

    • DPete27
    • 7 years ago

    The TDP increase is probably to accommodate the top-end (GT3) IGP in the K-series CPUs.

    Sure, put the GT3 in i7s, since they’re more likely to be used in workstations without an accompanying discrete GPU, but the i5-4670K will be the chip for gamers, who will be using discrete GPUs and don’t need the extra heat and cost associated with the GT3 IGP. Just use GT2 graphics there and keep the TDP at 77W.

    If you don’t know what the Haswell graphics tiers are, [url=http://www.anandtech.com/show/6355/intels-haswell-architecture/12<]read this article[/url<]

      • mganai
      • 7 years ago

      I didn’t see anything about the GTs here.

      From what I recall, GT3 will be reserved for laptops/ultrabooks.

        • dpaus
        • 7 years ago

        [quote<]I didn't see anything about the GTs here[/quote<] Uh, 2nd paragraph of the article......?

          • mganai
          • 7 years ago

          Which article?

          Every source I’ve read (including Intel’s own slides) states that GT3 will only be for the mobile markets.

          If the specs (i.e. clock speeds, HT, etc.) are to remain the same, and at a slightly higher TDP no less, I am disappointed. Sandy Bridge may have been more about efficiency than power, but there was an across-the-board spec bump.

          I wonder if the TSX instructions will make the i7 a more sensible choice than the i5.

        • alwayssts
        • 7 years ago

        That’s my understanding too, which makes sense.

        My thinking is the K series will use GT2 for a couple connected reasons.

        First is power/heat and directing that where it’s most useful in that sector, which is obviously the CPU.

        Second being bandwidth. Without the extra buffer, it makes sense to limit the internal GPU to something that can use the available bandwidth from the memory controller without hampering the CPU.

        This also correlates to clock speed. Just as an example, a GT3 may be 40 units (160 shaders) with a clock on the sweet spot of the voltage/power curve that will never be user-adjusted, but that would exceed what the IMC can handle. GT2 may be half that spec, but open for clock adjustment up to a level that makes sense with the available bandwidth from the memory controller/DDR3 (perhaps also coinciding with overclocking).

        While one can argue they’re putting their eggs in the mobile basket, I think a large part of it is trying not to make the same ‘mistake’ AMD did wrt design decisions associated with clock/power curves and the available bandwidth shared between the CPU/GPU.

        To put it another way, I think GT3 parts will be about low power, high amounts of pertinent logic (i.e. dual-core is more relevant than quad in some markets, even with a strong GPU), and making the most out of the chip from a power standpoint. I would associate this with old nvidia-think (more logic, lower clock at low voltage).

        GT2/K otoh, more old ATi-think. Less logic, higher clock…maximizing the potential of the core within an equal or higher power envelope.

        If that is the way they are going about it, it’s kind of clever imho. Not only does it allow them to attack the different markets from several different angles with one chip, it allows them a multitude of options for binning.

      • OneArmedScissor
      • 7 years ago

      There is no 77W Haswell, so it’s not really accurate to call it an increase.

      Haswell has integrated voltage regulators, so irrespective of the CPU and GPU architectures, the chip itself is quite different from Ivy Bridge, and they’d need to account for that for OEMs that need to figure out how to cool them.

      If you look at the table, they all have HD 4600 graphics, not 5000+. I believe Intel stated only mobile parts will get the highest end GPU, which has a frame buffer chip tacked on. They’d almost certainly give those a different designation.

      • MadManOriginal
      • 7 years ago

      GT3 in desktop chips would be nice for Quicksync.

    • chuckula
    • 7 years ago

    Brother Maynard! Consult the table of purported specifications!

    Brother Maynard: A reading from the table of purported specifications, chapter four, verse three. “And thou shalt taketh thine Haswell and pulleth the pin. Then shall thou counteth three. Thou shall counteth two only if thou then proceedeth directly to three. Thou shalt not counteth to four. Three shall be the number of the counting, and the number of the counting shall be three. Five it right out.

    Then after having counted to three, shalt thou lob thine Haswell at thine enemy who, in thy mercy, shalt snuff it.”

      • dpaus
      • 7 years ago

      +1 for the Monty Python reference. -1 for the multiple typos that ruin it.

      And next time someone rags on Apple, I’m bringing up The Witch! skit.

        • chuckula
        • 7 years ago

        See that’s the problem with having to post first… too much pressure.

          • dpaus
          • 7 years ago

          Hey, if you can’t take the pressure….

            • chuckula
            • 7 years ago

            buy a vacuum pump?

            • dpaus
            • 7 years ago

            What you do in the privacy of your own home is your business, and the only one here who’s interested is SSK. Well, and Neely if he thinks he’s being excluded.

            Please use PMs to tell them about your, um, appliances. It’s just TMI for the rest of us…

            • sweatshopking
            • 7 years ago

            first off, i’m a prude. second, neely is a female.

            • dpaus
            • 7 years ago

            That makes my Coming-Soon-To-A-Pub-Near-You beer-n-wings victory over [s<]him[/s<] her all the more interestly.

            • sweatshopking
            • 7 years ago

            HAVE YOU SEEN HER!?!?!!? “INTERESTLY” IS THE RIGHT WORD, “FUN” IS NOT.

      • derFunkenstein
      • 7 years ago

      Really, really well-done, Count Chuckula.

        • chuckula
        • 7 years ago

        Well.. when the news is frankly pretty boring, you have to do something to make it interesting.

      • brute
      • 7 years ago

      THINE = AN
      THY = A

      GENERAL RULE.

      THY HASWELL. THINE AGENA.
      THY PRESCOTT. THINE OROCHI.

        • dpaus
        • 7 years ago

        Thou art a party-pooper.

          • brute
          • 7 years ago

          thou not interestly

            • dpaus
            • 7 years ago

            Thy party art pooped too, verily.

          • flip-mode
          • 7 years ago

          Thou meanest thy party-pooper.

        • Squeazle
        • 7 years ago

        BRUTE KNOWS GRAMMAR GUYS.

        It’s just a few centuries out of date. I feel bad for ragging on him all this time now.

          • brute
          • 7 years ago

          ou t of date? i have no problem gtting date. ladies love me

            • dpaus
            • 7 years ago

            You know, when I saw your POST IN ALL CAPS, I immediately suspected you were SSK. This post confirms it.

            • brute
            • 7 years ago

            im not ssk

            reason 1. i was demoncraptically HE-lected
            reason 2. he is a kng
            reason 3. do he math

            • NeelyCam
            • 7 years ago

            Careful… they might [i<]look[/i<] like ladies, but... I guess you missed The Crying Game

        • derFunkenstein
        • 7 years ago

        Depending on the accent, it may be thine ‘aswell.

    • brute
    • 7 years ago

    regarding chips: salsa or dip?

      • NeelyCam
      • 7 years ago

      Salsa. Always salsa

        • dpaus
        • 7 years ago

        I made myself some roasted garlic salsa, with lots of roasted garlic and finely-chopped fresh red onions. I’ll never go back to plain salsa again.

        • derFunkenstein
        • 7 years ago

        My general rule is salsa for tortilla chips (or even Fritos if you’re feeling messy), dip for everything else.

          • BobbinThreadbare
          • 7 years ago

          That’s what Frito scoops are for.

      • albundy
      • 7 years ago

      I dunno, seems very spicy already. All I need is a benchmark topping.

      • Squeazle
      • 7 years ago

      MOLTEN COPPER DIP.

      One time I only had the tiny chip shards at the bottom of the bag though, so I poured them into my dip and ate it like a bowl of cereal.

      It wasn’t that good.

      • Srsly_Bro
      • 7 years ago

      Now I have to go to the store to get chips and dip. Thanks, brute.

        • brute
        • 7 years ago

        ur welomce
