Intel’s Broadwell goes broad with new desktop, mobile, server variants

As expected, Intel widened the reach of its 14-nm Broadwell CPU architecture across more of its product lines this morning with the introduction of a whole raft of new CPU models for desktops, laptops, and even servers. These CPUs promise improved power efficiency, higher per-clock performance, and faster graphics than the Haswell-based products that came before them.

The suite of desktop Broadwell processors looks like so:

All of these products are fundamentally similar and are likely just variants of a single type of silicon die.

The two products of most interest to do-it-yourself PC builders are the Core i7-5775C and the Core i5-5675C. These two parts are socketed processors that promise to drop into existing Z97 motherboards. They’re both unlocked for easy overclocking, and they both have quad cores with peak Turbo clocks of 3.7 and 3.6GHz, respectively.

Despite that fact, they’re not your typical high-end enthusiast fare. These C-series models have relatively modest 65W power envelopes and come with Intel Iris Pro 6200 graphics; the “Pro” denotes higher performance thanks to the presence of 128MB of embedded DRAM that acts as a graphics cache and as an L4 cache for the entire processor.
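
To see why 128MB of eDRAM can matter, here is a back-of-the-envelope sketch of what an L4 does to the average cost of an L3 miss. The latencies and hit rate below are illustrative assumptions, not measured Broadwell figures:

```python
# Toy model: average cost of an L3 miss with and without an eDRAM L4.
# All latencies (ns) and the hit rate are assumed for illustration only.

L4_HIT_NS = 40    # assumed eDRAM (L4) hit latency
DRAM_NS = 80      # assumed main-memory latency

def l3_miss_cost(l4_hit_rate: float) -> float:
    """An L4 miss pays the L4 lookup plus the trip to DRAM."""
    return l4_hit_rate * L4_HIT_NS + (1 - l4_hit_rate) * (L4_HIT_NS + DRAM_NS)

print(f"No L4: {DRAM_NS} ns per L3 miss")
print(f"With L4 (60% hit rate): {l3_miss_cost(0.6):.0f} ns per L3 miss")
```

The latency win is modest in this toy model; the larger benefit for Iris Pro is bandwidth, since the eDRAM gives the GPU a fast local pool it doesn’t have to fetch from main memory.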

As a result, the desktop Broadwells may have more appeal for those building quiet, compact systems like home-theater PCs or mini-ITX rigs than for traditional ATX-size systems used by gamers and power users. That dynamic could be altered by some really promising overclocking results, but we’ll have to see. Intel says review samples of desktop Broadwell chips should ship soon, but we don’t yet have them in hand.

The Core i7-5775C will have quad cores and eight threads via Hyper-Threading and sell for $366. The i5-5675C will have quad cores with quad threads and will list for $276.

The other desktop Broadwell parts are intended for all-in-one PCs and small-form-factor systems like the Intel NUC lineup. They’re meant for BGA-style mounting, not sockets, and will be consumed primarily by large PC manufacturers.

Above is a listing of the five different models of mobile Broadwell parts that Intel introduced for use primarily in larger laptops such as gaming-focused luggables and desktop replacements. They all have 47W power envelopes, well above the range of existing Broadwell chips. Most of them feature four cores, eight hardware threads, 6MB of L3 cache, and Iris Pro graphics with eDRAM acting as an L4 cache. Peak clock speeds are in the mid-3GHz neighborhood. As a result, Intel expects them to deliver much higher performance than earlier Broadwell-based processors.

In fact, it touts the gaming potential of the Core i7-5950HQ at 1080p in some of the world’s most popular games.

The firm also announced four new Xeon E3-1200 v4 series products likely based on the same Broadwell die. These Xeon E3 v4 parts have power envelopes ranging from 95W down to 35W, and all feature Iris Pro graphics. The firm bills them as having “Intel’s most powerful data center graphics” and promises 1.4X faster video transcoding performance thanks to Broadwell’s improved QuickSync accelerator block. The press release claims these CPUs could deliver “up to 4,300 simultaneous HD video streams per server rack,” which gives you some sense of who the customers might be for these products.
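
That “4,300 streams” figure is easy to sanity-check with some rack-density arithmetic. A minimal sketch, where the node count is our assumption (Intel doesn’t specify a rack configuration):

```python
# Rough sanity check of the "up to 4,300 simultaneous HD video streams
# per server rack" claim. The rack density is an assumed figure.

streams_per_rack = 4300   # Intel's headline number
nodes_per_rack = 42       # hypothetical: one 1U single-socket server per U

print(f"~{streams_per_rack / nodes_per_rack:.0f} HD streams per Xeon E3 v4 node")
```

That works out to roughly 100 concurrent transcodes per chip, a load that only makes sense with the fixed-function QuickSync block doing the heavy lifting rather than the CPU cores.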

Comments closed
    • deruberhanyok
    • 4 years ago

    So, big paper launch here, or what?

    • scienceomatica
    • 4 years ago

    It is boring because it pays more attention to laptop processors and their thrift. More attention should be given to the right things, starting with an analysis of desktop processors for real gamer enthusiasts.

    • Chrispy_
    • 5 years ago

    It’s the low-end processors like the Celerons, Pentiums and i3’s that need better graphics. The minute someone has $250+ to spend on a processor, they can afford a GPU that runs circles around even this massively improved IGP.

    • ronch
    • 5 years ago

    “These CPUs promise improved power efficiency, higher per-clock performance”

    Higher per-clock performance?!?!?!

    Just hold on a minute! IPC improvements should only come from architectural changes and you, Intel, should only do that every ‘tock’! Every ‘tick’ you guys should only do die shrinks and nothing more!

    • NeelyCam
    • 5 years ago

    If only this was socket-compatible with Sandy/Ivy… then I might be interested in upgrading. But since it requires a mobo replacement, an upgrade is too much money and work.

      • chuckula
      • 5 years ago

      So the scorecard so far:

      1. Krogoth is as Krogoth Does.
      2. Neely is non-Krogothed in certain hypothetical situations, but is Krogothed in his present predicament.

    • danny e.
    • 5 years ago

    better without the integrated crapola graphics.

      • derFunkenstein
      • 5 years ago

      I want integrated Crayola graphics.

        • Ninjitsu
        • 5 years ago

        256 colours?

          • ClickClick5
          • 5 years ago

          640×480?

          • BIF
          • 5 years ago

          No, the 64 was the big one in my childhood. The box came with a built-in sharpener!

            • ronch
            • 5 years ago

            Ah childhood days! Crazy how toddlers today already know how to swipe left and right and pinch to zoom in and out! And we thought 64-color crayons made of WAX were frickin’ awesome back then. ^_^

            • derFunkenstein
            • 5 years ago

            Yup that’s exactly the set I was thinking of.

            • chuckula
            • 5 years ago

            As a kid I actually got to take a tour of a Crayola factory with my class.
            AWESOME!

            They also had samples of all kinds of weird crayon colors that went beyond your standard 64 color box. Probably for market testing and prototyping.

          • ronch
          • 5 years ago

          8 colors.

          CGA = Crayola Graphics Adapter

      • BIF
      • 5 years ago

      I don’t understand why you got downvoted. I am of the same opinion. Well, not that the integrated “apooh graphics” are bad, but just that it’s a waste of die space that could be better utilized by more cores or cache.

        • End User
        • 5 years ago

        I’m glad Intel throws in integrated graphics. I’m using the integrated graphics of my 3770K to drive my second display. Primary is driven by my SLI setup.

          • JustAnEngineer
          • 5 years ago

          Why couldn’t you plug your secondary monitors into your main graphics card?

            • End User
            • 5 years ago

            If I recall correctly, the “Maximize 3D performance” option disables the second display when both displays are connected to the Nvidia cards. As I want a) 100% of my GPUs going toward gaming on one display and b) a second display showing apps, I use the integrated graphics to get what I want. I like to have the second display running when I game so that I can see GPU/CPU/memory usage and temps without having any type of overlay over the game.

            • Freon
            • 5 years ago

            I have three monitors connected to one card of my SLI setup. I use borderless windowed mode in all games and leave browsers, Open Hardware Monitor, etc. open, sometimes even watching videos.

            “Maximize 3D performance” still works, and it certainly performs better than just with one card.

            • Waco
            • 5 years ago

            Borderless windowed mode disables SLI…doesn’t it?

        • Krogoth
        • 5 years ago

        It is because the intended market for the chips is mainstream OEM rigs and system integrators.

    • AJSB
    • 5 years ago

    To be honest both -C models seem to have excellent performance in games for people that want to use iGPU….better than ANY AMD APU even if only marginally in some games…shame that these -C models are so expensive in comparison to AMD alternatives.

    • atari030
    • 5 years ago

    It’s great to see another article regarding additional choice in the CPU space. More excellent additions to choose from.

    Little did I realize that the article was also a vehicle for the independent chuckula Intel marketing push. Doesn’t Intel already have a pretty healthy budget for that on their own?

      • chuckula
      • 5 years ago

      [quote]chuckula Intel marketing push[/quote]
      Meh. I’m not impressed.

        • atari030
        • 5 years ago

        Neither was I.

        How many posts do you have in this one article alone?

          • Beelzebubba9
          • 5 years ago

          Considering most of Chuckie’s posts had useful content and added to the discussion why are you beefing?

            • atari030
            • 5 years ago

            Well I’d say half were useful, a quarter were some kind of negative AMD comment, and the remaining quarter were goofy drivel. Props to the useful stuff.

            But beyond that, the [b]real[/b] reason I’m beefing is I’m actually STILL WAITING FOR MY D*** APOLOGY from him that I never received. I still believe in normal human relations and not the [b]new age moron[/b] etiquette the internet has spawned.

            Anyways, since I never received an apology, I just have fun knocking him every once in a while. He kinda deserves it sometimes, anyways. Even though he and I are probably much more the same than I’d ever like to admit.

            • BIF
            • 5 years ago

            AMD is screwing their own business. If only a quarter of my posts dissed them, I’d say I was falling down on the job.

            • f0d
            • 5 years ago

            [quote]Well I’d say half were useful, a quarter were some kind of negative AMD comment, and the remaining quarter were goofy drivel.[/quote]
            But that’s why we all like the chuckster 😛

            • Wonders
            • 5 years ago

            Hey! Cut this out.

            • atari030
            • 5 years ago

            Don’t worry, I won’t bring it up again. Apologies to those whose sensibilities I offended.

            Beelze asked the why of the matter, so I gave him a pretty straight answer. Sorry it was personal.

            • maxxcool
            • 5 years ago

            ‘ropids

      • Visigoth
      • 5 years ago

      Hey assh0le, why don’t you take your sh*t with Chukula on p.m.? Nobody here is interested in the tripe you’re posting, or any other issues/problems you may have with a particular poster.

        • atari030
        • 5 years ago

        Seems to me you’d be better off following your own advice.

        • BIF
        • 5 years ago

        I generally have PM off, because well…it’s an odds-thing. I think a lot of others also have it turned off unless they’re expecting something from a specific person. Keeps the unsolicited spam under control, yes it does. 🙂 Which reminds me, I need to check that setting…

        Edit:

        Yes, it’s off. One of my dearest wishes is to someday in my lifetime, be able to completely shut off the US Postal Mail Spam. “Opt Out” really doesn’t work all that well. When I shut off the “Redplum” adverts, I just continue to get other people’s Redplum adverts because my mail carrier doesn’t go by address; he just puts one in every slot. Grrrrrr!

        • jaset
        • 5 years ago

        The irony is strong with this one.

      • ronch
      • 5 years ago

      And I’m AMD’s secret shill around here.

      I don’t get a cent from them though. So I do a 180° and bash them if I think they’re being moronic. So I’m not just a shill, I’m ALSO a self-help coach!

      But not a word about this secret, OK? I’m SO SECRET that AMD themselves don’t even know they have a secret shill around here. I should put it on my résumé though.

      • cegras
      • 5 years ago

      I concur. Every time I go into a CPU news thread I just control+f for chuckula and wait for him to tell me how AMD is terrible, and to reblog the latest news about intel just to make sure I know intel is the best. I have never seen anyone with a bigger emotional investment in technology, except for Silus the rabid nvidia fanboy.

        • chuckula
        • 5 years ago

        Meh. Less impressed than with Atari’s post. At least he had the guts to be semi-original.

          • cegras
          • 5 years ago

          Except I’ve been around long enough to see your pattern. You’re very boring.

    • derFunkenstein
    • 5 years ago

    I’m glad this is out of the way, so that we can count down to Skylake. Not that my system performs poorly or anything; my 3570K OC’d to 4.5GHz is fine. But I’ve had it since 2012 and part of me wants to just irrationally spend money. 😆

      • Krogoth
      • 5 years ago

      Don’t bother, unless you want more PCIe lanes.

      Skylake isn’t going to be ground-shattering.

        • Beelzebubba9
        • 5 years ago

        I figure Skylake’s most functional benefit to the power user will be better support for NVMe PCI-E x4 SSDs. Most people won’t notice a 10% difference in CPU power, but sure will notice 2GB/sec disk IO.
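
          For context, 2GB/sec fits comfortably inside a PCIe 3.0 x4 link; a quick sanity check using PCIe 3.0’s 8GT/s per-lane rate and 128b/130b encoding:

          ```python
          # Peak effective bandwidth of a PCIe 3.0 x4 link.
          GT_PER_S = 8e9         # 8 GT/s per lane (PCIe 3.0)
          ENCODING = 128 / 130   # 128b/130b line encoding
          LANES = 4

          bytes_per_s = GT_PER_S * ENCODING / 8 * LANES
          print(f"PCIe 3.0 x{LANES}: ~{bytes_per_s / 1e9:.1f} GB/s peak")  # ~3.9 GB/s
          ```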

          • derFunkenstein
          • 5 years ago

          And, at least in my case, I want to get up to 8 hardware threads, since I tend to run up to 3 VMs at a time when I’m doing some heavy testing. But I don’t really want to spend the money on an Ivy Bridge i7 when it’s not going to be THAT much more for Skylake (though I’d need a new mobo, too).

          If Skylake is DDR4-only, though, all bets may be off and I may just buy the Ivy Bridge. 32GB of DDR4 is kinda pricy still.

            • Ninjitsu
            • 5 years ago

            DDR4-only appears to be the case, at least on Z170. I assume Intel will be encouraging a ramp closer to Skylake’s hard launch.

            • derFunkenstein
            • 5 years ago

            I hope so, but if it’s still a big disparity between DDR4 and DDR3, I might put it off all the same.

      • BIF
      • 5 years ago

      My town used to have a “Skylake Bread Company”. Yum!

    • Thresher
    • 5 years ago

    Snore.

    Gimme Skylake.

      • ronch
      • 5 years ago

      And gimme Zen.

        • JustAnEngineer
        • 5 years ago

        As [url=http://www.youtube.com/watch?v=FC0Om7lkJtE&t=27m39s]requested[/url].

        If Intel had provided a test sample to TR, we might be a bit more interested in Broadwell, but it’s hard to get excited about a CPU that Intel doesn’t care enough about to promote to reviewers. How much longer until Skylake shows up?

          • chuckula
          • 5 years ago

          HOLY CRAP! +INFINITY FOR A POST-ZEPPELIN ROBERT PLANT REFERENCE!!

    • chuckula
    • 5 years ago

    Because this has become a popular (but not necessarily accurate) meme around here, BOTH of these parts come with VT-d [and basically everything else] enabled.

    [url]http://ark.intel.com/products/88040/Intel-Core-i7-5775C-Processor-6M-Cache-up-to-3_70-GHz[/url]
    [url]http://ark.intel.com/products/88095/Intel-Core-i5-5675C-Processor-4M-Cache-up-to-3_60-GHz[/url]

    This continues the trend from last year, when the overclockable Devil’s Canyon 4790K and 4690K both had these features enabled too. Oh, they don’t have Virtualization-for-Itanium support though, sorry about that.
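
    For anyone who would rather verify than trust Ark: on Linux, VT-x shows up as a CPU flag, and VT-d (when supported and enabled in firmware) exposes an ACPI DMAR table. A minimal sketch:

    ```python
    # Rough VT-x / VT-d presence check on Linux. The DMAR ACPI table only
    # appears if VT-d is supported *and* enabled in the firmware setup.
    import os

    with open("/proc/cpuinfo") as f:
        has_vtx = "vmx" in f.read()   # VT-x is advertised as the 'vmx' flag

    has_vtd = os.path.exists("/sys/firmware/acpi/tables/DMAR")

    print("VT-x:", "yes" if has_vtx else "no")
    print("VT-d (DMAR table):", "yes" if has_vtd else "no")
    ```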

      • Bauxite
      • 5 years ago

      Wait until they actually ship and some [s]sucker[/s] one confirms it. Ark has been dead wrong about CPU feature flags often enough, and it can take up to several months to get fixed.

        • chuckula
        • 5 years ago

        Oh, so you are saying that a lot of the hype about Intel products not supporting VT-d is incorrect, because they actually do support it but the listings on Ark were wrong?

        Care to provide a specific example?

          • Bauxite
          • 5 years ago

          Ever hear of TSX-NI? Though I guess a completely broken feature for a generation is too easy so:

          They had the wrong GPU on various Xeons during Ivy Bridge, missed AES-NI on Haswell i3s for a while, and put Haswell-E features on Ivy-E Xeons after a random website update over a year after release.

            • chuckula
            • 5 years ago

            TSX was not completely broken. After about a year of in-field use Intel & some external developers discovered that under certain extremely rare conditions there were some errors in TSX. As a precaution, Intel released firmware updates to let production system owners disable TSX… but that’s optional.

            Oh, and it’s fixed in both Haswell EX and Broadwell.

            [quote]They had the wrong GPU on various Xeons[/quote]
            OK, now you’re just grasping.

      • Andrew Lauritzen
      • 5 years ago

      Don’t all Broadwell systems support VT-d? E.g. the lowest-end i3-5005U I could find:
      [url]http://ark.intel.com/products/84695/Intel-Core-i3-5005U-Processor-3M-Cache-2_00-GHz[/url]

      Are there any 5th-gen Core systems that don’t? Links?

        • Ninjitsu
        • 5 years ago

        I don’t know what AnandTech means by “desktop chips” – all Intel desktop chips or just Broadwell?

        [quote]It is basically the same chip as the Core i7 "Broadwell" desktop that Ian reviewed yesterday: inside we find four Broadwell cores and a Crystal Well-backed Iris Pro GPU, baked with Intel's state-of-the-art 14 nm process. The Xeon enables ECC RAM support, PCI-passthrough, and VT-D, the former two being features that the desktop chips obviously lack, [b]and VT-D only being present in some desktop chips.[/b][/quote]
        [url]http://www.anandtech.com/show/9339/xeon-e31200-v4-launch-only-with-gpu-integrated[/url]

        And I’m guessing chuckula made that post because [i]traditionally[/i] overclockable Intel chips did not have VT-d enabled. Devil’s Canyon chips were the only exception, now joined by Broadwell-C, if I may be allowed to (confusingly) call it that.

          • Andrew Lauritzen
          • 5 years ago

          Yes it has traditionally been a bit scattershot as to which processors have VT-d enabled, but it looks like *all* 5th gen (Broadwell) cores have it, unless I’m missing one that doesn’t?

            • chuckula
            • 5 years ago

            Now that Broadwell is actually on the market, can I get you to spill some information (that’s publicly accessible now)?

            I did a quick-n-dirty Photoshop comparison of the Crystal Well chip [area known to be 77mm^2] to the Broadwell die in a photo from Anandtech’s review, and I got a rough die size estimate for the 4-core Broadwell die of around 160 – 180 mm^2. Can you give us a more precise/accurate number?
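
            The arithmetic behind that kind of estimate is simple photo scaling; a minimal sketch, where the pixel measurements are hypothetical placeholders rather than the numbers actually measured:

            ```python
            # Estimate an unknown die's area from a package photo by scaling
            # against a die of known area in the same shot. The pixel
            # dimensions below are made-up placeholders for illustration.

            CRYSTAL_WELL_MM2 = 77.0   # known eDRAM die area

            def scale_area(ref_mm2: float, ref_px: int, unknown_px: int) -> float:
                """Physical area scales with pixel area at a shared photo scale."""
                return ref_mm2 * (unknown_px / ref_px)

            estimate = scale_area(CRYSTAL_WELL_MM2, 120 * 90, 180 * 132)
            print(f"~{estimate:.0f} mm^2")   # ~169 mm^2, inside the quoted range
            ```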

            • Andrew Lauritzen
            • 5 years ago

            Unfortunately I can’t… not specifically because it’s a secret or anything, simply because I have no idea. Typically you guys are better at measuring than me once the parts are in the wild anyways 🙂

            • BIF
            • 5 years ago

            Haha, that’s all he was waiting for, physical dimensions so that he can start his own basement fabbing of socket-compatible competitor chips!

            😀

    • Ninjitsu
    • 5 years ago

    Also worth noting: the i5-5350H is basically an i3 with Iris Pro, and could have been deadly if it wasn’t being sold for $289.

    At less than $200? Killer.

      • willmore
      • 5 years ago

      [quote]At $200 less. Killer.[/quote]
      FIFY

    • kuttan
    • 5 years ago

    If Intel can make this much of an improvement in iGPU performance, then it could start considering entering the discrete GPU segment as well. It would be great to have a third GPU vendor.

    • Prestige Worldwide
    • 5 years ago

    Meh. I don’t think anyone cares about Broadwell on desktop. Skylake is just around the corner.

      • floodo1
      • 5 years ago

      Z97 owners who don’t want to upgrade their DDR3 mobos probably care a lot. I know I do.

        • Freon
        • 5 years ago

        What 1150 socket CPU do you have now that would lead you to want to upgrade to Broadwell?

          • smilingcrow
          • 5 years ago

          I’d happily swap my i5-4670K for one of these.

          • floodo1
          • 5 years ago

          G3258. Basically any dual core. At least.

    • ultima_trev
    • 5 years ago

    Saw the review at Tom’s. Iris Pro 6200 was almost as fast as GTX 750 in GTAV. By the time Cannonlake arrives on desktop, we’ll probably see IGPs on par with R9 285 / GTX 960.

    • jjj
    • 5 years ago

    Yeah, look at that die: without the GPU and display blocks they could easily fit 12 cores on it. Even then the die would be rather small for the prices they ask.
    We really need help from AMD, ARM, anyone; Intel is determined to kill the PC.

      • chuckula
      • 5 years ago

      If AMD had come out with a chip that had anywhere near the CPU power OR the GPU power OR the power efficiency that we’ve seen here.. oh and if it were compatible with existing motherboards… then we’d be hearing about how AMD is back and there’s new life in the PC market.

      When Intel does a hat-trick + 1 by doing all four things simultaneously, all we hear about is how Intel is determined to kill off PCs because … uh.. they don’t like money or something.

        • ImSpartacus
        • 5 years ago

        That’s an interesting perspective. I guess they really did just do all four at once. That’s neat.

        • jjj
        • 5 years ago

        What has Intel done? Just 4 cores and a GPU that can’t even do 1080p? And all at very high prices. This is a product that can only sell on marketing, like Core M: you get screwed but you think you are getting something worth it.
        If there was any competition in this space we would be getting 16 cores or more at these prices, and that’s not debatable.
        They are protecting their server and workstation business by giving consumers terrible products and forcing them to pay for pointless GPUs.
        If we buy a $50 chip at $300, it could at least not be mostly things we don’t need.

          • Beelzebubba9
          • 5 years ago

          [quote="jjj"]If there was any competition in the space we would be getting 16 cores or more at these prices and that's not debatable.[/quote]
          Well if you say it’s not debatable then I guess we’d better all just agree!

          No but seriously, I can’t agree with either half of that argument. There are almost no good technical reasons to put a 16C/32T AMD64 CPU in the desktop other than to cater to the e-peen of nerds. All you’re looking at is die allocation, not bandwidth (socket 1150 has half what 2011 does) nor TDP (18C Xeons top out at 160W, and the i7 5960X is 140W to the i7 4790K’s 88W), both of which factor heavily into system cost and usability.

          In the markets Intel dominates, the biggest threat to their future revenue comes from their legacy products, so even in a vacuum they have to keep pushing the cutting edge and giving their customers what they want, or else their upgrade cycles will stretch on much longer and Intel doesn’t make money. This is why core counts and per-socket bandwidth of their Xeons keep climbing aggressively: that matters enough in most datacenters to be worth a hardware refresh if you can double your compute density.

          These factors are generally irrelevant on the desktop, which is why Intel has focused so much engineering on providing the features their customers can actually use, like super low power consumption or good iGPU performance, and not thread counts that can only be saturated in synthetic or niche professional workloads (who should buy a Xeon anyway).

          It’s all very simple. Intel makes their money selling CPUs. If they can make engineering changes to sell more CPUs, then they will do so. The ignorant opinions of nerds are utterly irrelevant to their business model and they behave accordingly.

          • Andrew Lauritzen
          • 5 years ago

          > And all at very high prices
          It’s not even that much more expensive than the equivalent Haswell CPUs, with a significantly more capable iGPU and EDRAM. Certainly a far cry from the claims that Iris Pro costs $500+ that I commonly heard.

          Remember that the reason these things exist is that people asked for them (maybe not you, but yeah). Reviewers constantly compare socketed chips to AMD A-series chips and say how AMD is better in graphics and so on. You can’t do that and then complain when Intel brings a chip with a beefier iGPU – as an *option* – for folks who might want that.

          If you don’t need the iGPU, then don’t buy it. If you want more cores, buy a HSW-E. But these armchair architect comments about how you could fit 20 cores into an ultrabook and run faster than devil’s canyon and Xeon combined for $100 are tiresome. If you can’t point out a chip that does all of the relevant things better, it’s pure speculation at best and more likely a vast trivialization/misunderstanding of the constraints of making a chip.

            • w76
            • 5 years ago

            I agree with the sentiment that Intel is holding back a little bit because of the lack of competition from AMD (and the fear of killing AMD entirely, because then there would be anti-trust assaults), but the crazies are going so far past the mark it’s enough to make a reasonable person an Intel defender. Intel is pushing the boundaries of what’s technologically possible, at the cost of billions of dollars, and utilizing some of the brightest people in the business. There’s no moral case against them profiting from that. It’s also certainly not their fault that AMD made a long string of awful decisions.

            And why don’t these same people strike as ruthlessly and loudly at Apple? They’ve got some of the most jaw-dropping net margins you’ll see anywhere. They literally have more money than they know what to do with. They issue bonds, just because their credit and brand is so incredible they’d be dumb not to, but barely know what to do with the proceeds from said bonds. It’s incredible. But, hippies everywhere use iPhones, so lets hate on Intel, I guess.

      • Klimax
      • 5 years ago

      How many desktop loads scale to n cores again? Almost none. Simply wasted space, and frequency-constrained. All bad trade-offs for this market…

    • anotherengineer
    • 5 years ago

    Will there be an i3 with the iris pro 6200??

      • ImSpartacus
      • 5 years ago

      not unless amd gets their shit together.

        • chuckula
        • 5 years ago

        The good news for AMD is that HBM is the type of technology that will solve the bandwidth issues in APUs.

        The bad news for AMD is that they aren’t ready to actually produce an APU for consumer use* that includes HBM. That might happen in 2017.

        * There’s talk of a high-end “APU” that’s really a full-bore GPU + some low-end CPU cores for use in HPC, but those aren’t consumer-grade parts.

          • Beelzebubba9
          • 5 years ago

          Yeah it’d be amazing if AMD can deliver a competitive CPU core in Zen by 2017. Amazing as in it’ll probably take an act of God.

          • Andrew Lauritzen
          • 5 years ago

          > The good news for AMD is that HBM is the type of technology that will solve the bandwidth issues in APUs.

          Agreed, although do you think the sorts of consumers who care about socketed CPUs will happily give up DIMMs/RAM upgrades?

          More seriously, if you’re going to put enough HBM on to satisfy both CPU and GPU use (i.e. 8GB+ minimum) I’m not certain that you’re still going to fall in a compelling price range (for entry i3 class products) in the near future. Obviously eventually things are going to go that way, but I doubt it by 2017…

            • chuckula
            • 5 years ago

            Technically having HBM doesn’t completely preclude you from also having external DDR4… you should know since Knights Landing is using a similar-to-HBM-but-I-know-it’s-not-exactly-HBM memory on-package but still has up to 6 channels of backing DDR4.

            • Andrew Lauritzen
            • 5 years ago

            Absolutely, but it not only makes the memory controller more complicated, but it makes the entire platform (much) more expensive which exacerbates the second problem…

    • Unknown-Error
    • 5 years ago

    Just checked the reviews at AnandTech and Tom’s Hardware, and Intel takes the integrated graphics crown from AMD by a good margin with plenty of watts to spare. Now AMD fanbois have nothing to gloat about. $5 billion spent on the ATI acquisition down the drain. Things are only going to get worse with Skylake. Carrizo won’t do squat.

    PS: The much-hyped Radeon Fury (Fiji) will be slightly slower and have slightly worse power draw than the 980 Ti. The Bulldozer of GPUs.

      • anotherengineer
      • 5 years ago

      Troll Alert!!!

      • ImSpartacus
      • 5 years ago

      It’s sad. I’m REALLY rooting for amd. I love the idea of an hbm apu. I hope they stay alive long enough to deliver.

      • Flying Fox
      • 5 years ago

      That $5 billion is already down the drain. I believe it has been mostly written off already.

        • AGerbilWithAFootInTheGrav
        • 5 years ago

        The wrong company bought the other one out. While not stellar, ATI management was more operationally capable than AMD’s at the time, which I guess is obvious today. The largest error was going along with the takeover.

      • DragonDaddyBear
      • 5 years ago

      Their transistors are 1/2 the size. All things considered AMD is doing alright in the performance area with the crap core and old process tech.

      • NeelyCam
      • 5 years ago

      [quote]Just checked the reviews at AnandTech and Tom’s Hardware, and Intel takes the integrated graphics crown from AMD by a good margin with plenty of watts to spare.[/quote]
      Yes. If you’re willing to pay for it… the price for Iris Pro is pretty steep.

      • Beelzebubba9
      • 5 years ago

      [quote="Unknown-Error"]PS: The much-hyped Radeon Fury (Fiji) will be slightly slower and have slightly worse power draw than the 980 Ti. The Bulldozer of GPUs.[/quote]
      Why would you assume this? It’ll have massively better memory bandwidth, at least, and AMD has historically been able to compete with Nvidia fine on the performance front.

    • swaaye
    • 5 years ago

    That’s probably going to put the hurt on NV’s midrange notebook hardware.

    • bfar
    • 5 years ago

    I’d love to see a good overclocking analysis of these chips 🙂

    The rate of generation-to-generation GPU performance improvement on Intel’s IGPs is very impressive, much higher than in the discrete world.

    • willmore
    • 5 years ago

    Over half the die space just wasted, how sad.

      • chuckula
      • 5 years ago

      Die pollution is a tragedy.
      [url]http://pix-media.s3.amazonaws.com/blog/808/iron_eyes.jpg[/url]

    • chuckula
    • 5 years ago

    Hey Damage: You do have one (or two) samples in-house for testing, right?
    I know that being halfway around the world doesn’t make the process easy.

    • exilon
    • 5 years ago

    No review samples?

    Andrew Lauritzen
    Andrew Lauritzen
    Andrew Lauritzen

      • chuckula
      • 5 years ago

      No, you get the review samples from Beetlejuice:

      Beetlejuice
      Beetlejuice
      Beetlejuice!

        • exilon
        • 5 years ago

        You killed it, chuck.

      • Andrew Lauritzen
      • 5 years ago

      Sigh. All I can say is I’ve tried but have zero power when it comes to marketing-type stuff.

        Trust me, it bugs me at least as much as it bugs you… the quality of reviews at the sites that got early parts is pretty low, but it’s beyond my ability to influence 🙁

        • chuckula
        • 5 years ago

        YOU PAID INTEL… EMPLOYEE!!

        AT LEAST USE YOUR EMPLOYEE DISCOUNT TO BUY A COUPLE AND SEND THEM TO TR!

        • exilon
        • 5 years ago

        The summoning worked!

    • Ninjitsu
    • 5 years ago

    I read AT’s and THG’s reviews, and Broadwell-C isn’t terrible by any stretch.

    Best IGP performance there is, with the i5 at $276. For that price, you [i]also[/i] get better performance than any AMD processor (especially ones with IGPs), and about the same CPU perf as a Haswell i5.

    Of course, you could find a $100 CPU and $150 GPU that would give you better gaming performance (I didn’t like how AT used a $70 GPU), but in terms of balanced performance (and price) from a single chip, the i5-5675C is really good. The i7-5775C probably makes less sense: not enough over the i5-5675C or the 4790K except in a few benchmarks like WinRAR.

    Tom’s Hardware has really good power consumption benchmarks, and the idle power numbers are really good (so are the load ones). Only in a combined CPU and GPU stress test does Broadwell-C momentarily exceed its TDP (but it still holds it on average).

    AT’s going to release overclocking numbers, so let’s see how they look. Waiting for TR’s review too, of course.

      • JustAnEngineer
      • 5 years ago

      A $110 Radeon R7-260X 2GB card is far better than any IGP.

        • chuckula
        • 5 years ago

        That’s 100% true and here’s why Intel is happy:
        1. You didn’t say the 240.
        2. You didn’t say the 250.
        3. By the time you got to the 260X you were quoting a three digit price.

          • JustAnEngineer
          • 5 years ago

          The Radeon R7-260X has occasionally been under $100 (especially after rebate), but it is currently selling for slightly more than that. I strongly advise against buying any discrete graphics card less capable than Radeon R7-260X or GeForce GTX 750Ti with 2 GB of memory. If you are looking for less performance than these, you really [b]should[/b] just use the IGP.

          • Pancake
          • 5 years ago

          See, thing here is I’m an enthusiast and game with a GTX970. So, all those GPU transistors would be utterly wasted in my case.

          Get rid of that crap. Use the silicon for more cache or more cores. Or just make it smaller and cheaper with the thermal headroom for faster clocks.

            • Andrew Lauritzen
            • 5 years ago

            > Get rid of that crap. Use the silicon for more cache or more cores.
            Haswell-E.

            > Or just make it smaller and cheaper with the thermal headroom for faster clocks.
            That’s what the regular configurations with smaller GPUs are for. Note that you obviously don’t eat up any thermals when the GPU is not in use. The CPU can happily eat 65W+ all by itself 🙂

            • Ninjitsu
            • 5 years ago

            Go get a 4790K, Broadwell-C isn’t for you.

            • Pancake
            • 5 years ago

            4790K isn’t for me either. Hardly any upgrade. So, like many here we’ll wait for the mighty Intel clock to tick again (or is it tock?)

            • Ninjitsu
            • 5 years ago

            In that case, what’s the complaint? You sound like you want to upgrade for the sake of it, not because you need to… which is strange, frankly.

            I mean, seriously, what’s your workload? What are your requirements? What can’t be fulfilled by the CPU you have, the 4790K, the 5820K, or the 5960X?

            I’d wager that you’d be better served by spending money on a better/another GPU than the CPU, going by what you’re saying – unless of course you’re trolling, in which case, well done I suppose. XD

            • Beelzebubba9
            • 5 years ago

            It’s this weird tic nerds have where they have to comment on the ways in which every hardware release isn’t ideal for them. Intel already makes a product that’s exactly what the user is asking for (Haswell-E), and yet they still need to post about how a product clearly targeting an entirely different niche should be identical to an existing product and therefore not exist?

            After reading 1,384 posts spitting venom at Apple for not including an RS-232 port/ethernet/dual charging ports/whatever on modern laptops by people who would never, ever buy a Mac, just because they really needed to be heard, I came to the conclusion that people are insane and lonely.

            • Pancake
            • 5 years ago

            I am a nerd. I upgrade maybe every 3-4 years so I buy products at that frequency. However, as a badge carrying nerd I’m interested in everything that comes out.

            I also did vote on some poll a while back that Broadwell C would be the most interesting CPU for me. I had great expectations of the L4 cache. But the chip is clocked too low and beaten by Haswell.

            • Beelzebubba9
            • 5 years ago

            …but both of the socketed Broadwell-C CPUs are unlocked so why do their lower clocks (due to the 23W lower TDP…) matter? Fixing that is just a few keystrokes away.

            • Pancake
            • 5 years ago

            Massive “whoosh” here. Broadwell-C isn’t a worthwhile upgrade for someone with a recent-ish CPU in a high end PC. That’s all I’m sayin’. Why so defensive? I’m not even sure what point you’re trying to make. Nothing to see here. Bring on Skylake.

        • Ninjitsu
        • 5 years ago

        I can’t possibly include every CPU and GPU combination under the sun that totals $276.

        And I suppose AT used a $70 GPU to remain under $270 when combined with a comparable Haswell i5.

          • JustAnEngineer
          • 5 years ago

          There is a performance threshold below which add-in GPUs are pointless.

          Although there are scads of graphics cards offered that are much less capable than Radeon R7-260X for just a few dollars less (and even some that are priced for a few dollars more), they all provide unacceptable gaming performance.

          I would be glad if Intel’s Broadwell managed to kill off all of the lousy obsolete GPUs that are still on the market like GeForce 210, 8400GS, GT610, 6200, GT720, GT730, GT420, GT520, GT630, FX5200, GT740, 9400GT & 605 and Radeon HD5450, HD6450, HD4350, R5-230, 7000, R7-240, HD4650 & HD3650 (listed in order of increasing price for [b]new[/b] cards at Newegg).

            • Ninjitsu
            • 5 years ago

            Of course, I’m not disagreeing with you!

            I had meant the same when I suggested a $100 CPU + $150 GPU – you can find better combinations within the same price config.

            The cheapest Haswell i5 + R7-260X would end up marginally more expensive than the 5675C, that’s true.

            I’m just saying that the 5675C is quite a well balanced chip, especially when you compare with AMD’s APUs – since it provides considerable GPU [i]and[/i] CPU advantages. Still not quite a gaming solution though.

            To be fair, and going by AT’s review, I think the 5675C may indeed manage to kill off the lousy obsolete GPUs you’ve listed, and it’ll even have much better compute performance.

          • derFunkenstein
          • 5 years ago

          Thing is, that didn’t appear to be AT’s goal. They used a $70 GPU with a $240 CPU. If they wanted to do dollars-to-dollars, they’d have to compare against a slower CPU like an i5-4430 with that R7 240.

    • odizzido
    • 5 years ago

    Look at all that space being taken up by the GPU. An 8-core CPU without a GPU for the same price as the 4-core with one, please.

      • Ninjitsu
      • 5 years ago

      There is very little economic precedent for that, I hope you realise. There’s also very little [i]use[/i] for such a chip for most consumers - I’d argue that those who could fully use a 5960X can pay for it, with few exceptions. The 5820K exists as well.

      For everything else, there’s…MasterCard?

        • odizzido
        • 5 years ago

        Yes, I know Intel has no reason to offer us what we’d like because they have essentially no competition. Also, I could buy a $1,000+ CPU, but I cannot even come close to seeing it as worth it. The 5820K is still very expensive.

        But an 8-core version of the i5-5675C for $276? That might get me to upgrade my old i5-750.

          • Ninjitsu
          • 5 years ago

          The 5820K is quite cheap for what it offers, if you can use it. I don’t think you have enough of a use case for an 8-core if 6 cores and 12 threads at under $400 is “very expensive”.

            • ImSpartacus
            • 5 years ago

            Yeah, the 5820k is perfectly passable. It’s an excellent compromise.

        • raddude9
        • 5 years ago

        I think you are missing the obvious reason you can’t get a cheap 8-core from Intel… they are protecting their server chip revenue.

          • Ninjitsu
          • 5 years ago

          I’m not – that’s exactly what I mean by there not being enough economic precedent.

          It would also not be as profitable to sell something so large to consumers at $300.

            • odizzido
            • 5 years ago

            We’re all in agreement then about why they don’t offer it. I just think it’s a damn shame they’d rather gouge their customers than offer something they’re clearly capable of.

            • Firestarter
            • 5 years ago

            You want altruism from Chipzilla? That’s not how companies get this big.

            • maxxcool
            • 5 years ago

            No one would buy it: only the enthusiast market, the sub-1% market. It is not worth the SKU.

            • w76
            • 5 years ago

            You say “gouge,” I say “earn a profit from the investment of billions, in machines and people, over the course of decades to a degree which few companies are even capable of dreaming of.” Point of fact: if the price of Intel’s products exceeded their value to users, users would not buy them. That’s basic logic, and basic economics.

          • Krogoth
          • 5 years ago

          Because mainstream demand for more than 4 threads is non-existent.

          Why sell a product to a demographic that doesn’t care for it when it isn’t worth the production costs (8+-core chips aren’t cheap to make due to yields)?

          It is akin to trying to sell a V10 muscle car or twin-turbo diesel full-size truck to an average-joe owner who has no use or desire for it.

        • ImSpartacus
        • 5 years ago

        I died at the master card quip.

        • Pancake
        • 5 years ago

        Q6600. Got one. Was a little miracle when it came out. 4 cores for $500 or whatever it was…

      • ImSpartacus
      • 5 years ago

      The eDRAM is still on 22nm, right? So it’s cheaper.

      And 14nm was having issues, so maybe a big chip is deceptively expensive right now?

      • Beelzebubba9
      • 5 years ago

      Why doesn’t it seem to occur to anyone here that there are engineering tradeoffs in designing an 8C CPU for the desktop that are counterproductive for the vast majority of desktop and mobile workloads?

      Have you not noticed that the i7-5960X has quad-channel memory and a 140W TDP, only to lose to or tie the higher-clocked i7-4790K in most typical user workloads? Why would Intel make a desktop chip that is more expensive to manufacture, requires more expensive motherboards, twice the number of DIMMs, and a lot more power consumption to provide no meaningful performance improvement for the types of workloads the majority of users need?

        • floodo1
        • 5 years ago

        Because beowulf cluster.

        99% chance that I buy a 5775C even though so many people say “broadwell on the desktop is [pointless]”

          • Krogoth
          • 5 years ago

          You can still *get* six and eight-core chips though. You just don’t like the prices for them. 😉

      • Andrew Lauritzen
      • 5 years ago

      Um yeah, it doesn’t really work that way. Compare this die (and TDP) to HSW-E and Xeon D – lots of factors beyond “visually I feel like this space can fit XYZ” 🙂 Also see Beelzebubba9’s comments.

      In any case I believe you can get the HSW-E 6 cores for a similar price, yes?

        • lycium
        • 5 years ago

        Awesome to see you posting here 🙂 Intel are rocking it so hard atmo!

      • Voldenuit
      • 5 years ago

      [quote]Look at all that space being taken up by the GPU. An 8-core CPU without a GPU for the same price as the 4-core with one, please.[/quote]
      While I agree with your sentiment, I believe that GPU logic is easier to design for fault tolerance and yields, so it probably wouldn’t be as simple as replacing all that GPU silicon with CPU silicon. Same with things like L3 cache, etc.

      • Bauxite
      • 5 years ago

      I’d rather they just chop off the monopoly-bundled GPU and pass the TDP/$/etc. savings along. Then their precious server lock-in is still protected (no ECC, no large DIMMs, no multi-socket, low PCIe lane count, etc.) and we get higher clock speeds to pair with a nice real GPU.

      • Krogoth
      • 5 years ago

      Intel wants you to get their workstation-tier stuff for that.

      • cegras
      • 5 years ago

      Funny, where were all the people explaining why this wasn’t possible here:

      [url]https://techreport.com/discussion/28385/amd-carrizo-brings-power-savings-to-mainstream-laptops?post=912142[/url]

    • PrincipalSkinner
    • 5 years ago

    So 5675C is 13% more expensive than 4690K, has lower clocks and offers a few measly percent of IPC gains?
    Thanks, I’ll pass.
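
    For what it’s worth, that premium is easy to check against list prices; a quick calculation, assuming the 4690K’s $242 launch price (the 5675C’s $276 comes from the article above):

    ```python
    # Price gap between the i5-5675C and i5-4690K list prices.
    i5_5675c, i5_4690k = 276, 242
    print(f"{(i5_5675c / i5_4690k - 1) * 100:.0f}% premium")   # ~14%
    ```

    So call it 13-14%, depending on street prices.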

      • floodo1
      • 5 years ago

      You don’t care about any of the uses that the IGP has? And you don’t care about the power savings?

        • wimpishsundew
        • 5 years ago

        Most people who upgrade their own desktop are probably already running a dGPU, so the IGP is not really a big deal.

        The power savings are great but not necessary for a DIYer, since they’re going to use the same PSU. As long as it doesn’t suck more juice than the previous chip, I’m happy.

        Honestly, you’d be crazy to upgrade to a 5675C from a 4690K unless electricity is ridiculously expensive where you live. If that were the case, just stick with a laptop.

    • chuckula
    • 5 years ago

    The reviews are starting to trickle in. I’m waiting for TR’s official review for my deeper analysis but one thing is crystal clear: Intel has a 65 watt package with a CPU that is definitively faster than an FX-9590 and a GPU that is definitively faster than the highest end “Godzillavari” parts that AMD is releasing this summer.

    What we haven’t seen is how well it overclocks just yet.

    Just remember this: Desktop Broadwell is the throw-away fugget-about-it launch from Intel this year.
    Skylake is the interesting part…..

      • ultima_trev
      • 5 years ago

      In terms of single-threaded performance, sure. Granny or grandpappy on their netbook would be hard-pressed to tell the difference between Broadwell and FX 9xxx for checking Facebook.

        • Beelzebubba9
        • 5 years ago

        Good thing people don’t usually buy high end desktop CPUs merely to render web pages?

        • chuckula
        • 5 years ago

        [quote]Granny or grandpappy on their netbook would be hard-pressed to tell the difference between Broadwell and FX 9xxx for checking Facebook.[/quote]
        Oh, I think that the house burning down due to the fire hazard of an FX-9 series part in a netbook would be noticeable to Granny or grandpappy (but maybe not uncle Earl).

    • njoydesign
    • 5 years ago

    All of the mobile parts appear to be BGA, so no chance to have a drop-in upgrade for Haswell laptop owners. Pity.

      • chuckula
      • 5 years ago

      Yeah, Carrizo is SO MUCH BETTER…. oh wait.

        • njoydesign
        • 5 years ago

        hummm, where was I mentioning AMD there?

          • chuckula
          • 5 years ago

          The whole “oh boo hoo laptops are non-upgradeable oh, Intel is so evil!” schtick is dripping with sour grapes AMD fanboyism [or maybe outright ignorance of how laptops work].

          Technicalities like failing to actually mention AMD while you post their preformulated talking points don’t count.

            • njoydesign
            • 5 years ago

            see my reply below to Kurotetsu, who has never seen a laptop that was taken apart.

      • Kurotetsu
      • 5 years ago

      Since when have laptop owners been able to drop-in upgrade their CPU? Or do you mean laptops-in-name-only aka portable desktop replacements?

        • njoydesign
        • 5 years ago

        I don’t know where you’re coming from, but I have done so on 2 laptops of mine and on 1 of a friend’s. The laptop I own right now sports a socketed Haswell, which gives me the possibility to replace it whenever I feel the need to, all the way up to the extreme i7s that go into gaming laptops; the cooling is sufficient to take on something more heat-emitting than the 47W 4700MQ.

        So my reply has nothing to do with AMD; it is my personal gripe with there being no socketed Broadwell in sight, and I don’t understand all the downvotes I’ve received so far. (OK, that part was aimed at chuckula, so you can ignore it.)

          • njoydesign
          • 5 years ago

          -3 within a minute? Wow, seems I’m getting some fanbase.

            • ImSpartacus
            • 5 years ago

            Can’t you pay to be able to vote multiple times?

            • Melvar
            • 5 years ago

            Quick plus or minus 3 votes are probably from a single gold subscriber.

            • njoydesign
            • 5 years ago

            I just don’t get it when an arrogant and illiterate post gets +4, while my just stating the reasons for the comment I made and explaining to that poster why he is wrong nets me, what, -6 now.

            But I guess that’s the internet. I was, however, of a better opinion of the TR readerbase.

            • Melvar
            • 5 years ago

            Comment votes are just anonymous grunts of approval or disapproval. They don’t imply that any particular level of thought has been given to the subject.

        • Klimax
        • 5 years ago

        EliteBook – mobile and powerful (and not really heavy).
