Haswell integrated graphics keeps up with GeForce GT 650M

Intel’s integrated graphics solutions have improved immensely over the past few years. Expectations are high for the GPU built into next-generation Haswell chips, and we have an early look at what it can do. At the Consumer Electronics Show in Las Vegas, Intel is demoing the GT3 version of the integrated GPU alongside Nvidia’s GeForce GT 650M discrete solution. Our own Scott Wasson captured the side-by-side demo on video, which we’ve embedded below.

Both systems are running DiRT 3 with all the details cranked at 1080p resolution. Scott had difficulty detecting differences in performance and image quality between the two in person, and they look very similar on video. I believe the Haswell system is the one on the left.

The fact that Haswell’s IGP can run DiRT 3 at these settings is certainly impressive. So is keeping up with the GeForce GT 650M, which is a mid-range part with 384 ALUs, a 128-bit path to dedicated memory, and GPU clock speeds as high as 900MHz. The 650M is Nvidia’s fastest GT-series mobile part, and it’s the same chip used in Apple’s Retina-equipped 15" MacBook Pro.

Drivers have long been a weakness for Intel graphics solutions, but the firm is now on a regular release cadence of at least four updates per year. Intel has also released a new QuickSync SDK that has hooks for Haswell to ensure developers can have compatible software in time for the processor’s launch. Interestingly, QuickSync has gained a partially open-source dispatcher that could allow the transcoding tech to be used in open-source projects like Handbrake.

Comments
    • NovusBogus
    • 7 years ago

    I’ll believe this when I see it fo realz, not in a precision-engineered marketing demo. Intel made a huge investment in Larrabee and fell flat on their asses, to say nothing of their long history of terribad IGPs, so I don’t really put a lot of stock in what they have to say about graphics until they deliver the goods.

      • NeelyCam
      • 7 years ago

      Larrabee is a completely different beast… now

    • ronch
    • 7 years ago

    Did Scott check to see if the game is actually running or just a video playing on VLC?

    • MrDigi
    • 7 years ago

    Mobile GPUs, R.I.P.

    • Bensam123
    • 7 years ago

    Intel owns Lucid right? Makes me wonder if they had a hand in this… They don’t display the frame rate, so we may be looking at a smooth experience on a lower frame rate (which isn’t necessarily a bad thing), just something worth noting.

      • Deanjo
      • 7 years ago

      Lucid is still its own company, but Intel has invested heavily in it.

    • grantmeaname
    • 7 years ago

    Since Scott took the video, it’s like 480 fps and you can just slow it down and look at the stuttering, right?

    I’m fairly sure that’s how it works, anyways.
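
    Tongue-in-cheek or not, the underlying idea is sound: given per-frame timestamps (from a high-speed capture or a logging tool), frame times and stutter fall right out. A minimal sketch, with a made-up timestamp list and a 33.3 ms (~30 fps) threshold chosen purely for illustration:

    ```python
    def frame_times_ms(timestamps_s):
        """Convert a sorted list of frame timestamps (seconds) to
        per-frame render times in milliseconds."""
        return [(b - a) * 1000.0 for a, b in zip(timestamps_s, timestamps_s[1:])]

    def stutter_frames(times_ms, threshold_ms=33.3):
        """Count frames that took longer than the threshold."""
        return sum(1 for t in times_ms if t > threshold_ms)

    # Hypothetical capture: steady ~60 fps with one 50 ms hitch.
    stamps = [0.0, 0.0167, 0.0333, 0.0833, 0.1000]
    times = frame_times_ms(stamps)
    print(stutter_frames(times))  # → 1 (the 50 ms hitch)
    ```

    An average-fps counter would hide that hitch entirely, which is why frame-time plots are more revealing than frame rates.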

    • Arclight
    • 7 years ago

    This is marketing; until we see the numbers measured by TR, I'm Krogoth'ed.

      • MustangSally
      • 7 years ago

      [quote<] i'm Krogoth'ed[/quote<] Finally, the meme has a marketing tag-line

    • DeadOfKnight
    • 7 years ago

    This could be Atom vs. GTX 680 and it wouldn’t make a difference so long as it runs smoothly on the Atom.

    Dirt 3 isn’t a particularly demanding game, so without performance numbers this tells us absolutely nothing.

      • Farting Bob
      • 7 years ago

      It tells us that reasonably modern and graphics-intensive games (DiRT 3 isn't the toughest game to run, but it still requires a decent amount of horsepower to do 1080p at max settings) will run on the GT3 IGP, which will be the only source of graphics in many laptops.

        • DeadOfKnight
        • 7 years ago

        It’s still a misleading comparison.

          • clone
          • 7 years ago

          it’s not misleading unless you’re looking to be misled.

          it’s a real world comparison between 2 gfx solutions using 1 game, nothing confusing about it at all.

        • clone
        • 7 years ago

        it doesn't tell you that at all. It tells you that Intel has managed to get DiRT 3 running decently on its IGP and nothing more, especially given Intel's historically dismal driver support.

        at best it offers hope that Intel might have something interesting….. “might” being the operative.

    • Sahrin
    • 7 years ago

    The hardware may be fine; but notice the tightly controlled circumstances. Companies release data like this under controlled conditions for specific reasons – not just ’cause.

    Intel's gaming Achilles heel has always been drivers. And if you had trouble running complex games, how would you demo your new hardware? Pick one game that uses an off-the-shelf engine, optimize the hell out of it, write a check to the developer to get them to work with you, and prevent the hardware websites from putting your chip on the bench.

    So while I'm glad Intel is finally taking graphics seriously, and the long-term implications of competent hardware in the PC market are fantastic (if the baseline rises from "GeForce 4 MX" to "GT 650M" we're doing OK), I'll hold onto my disbelief until there is reason to do otherwise.

      • chuckula
      • 7 years ago

      [quote<]The hardware may be fine; but notice the tightly controlled circumstances. Companies release data like this under controlled conditions for specific reasons - not just 'cause. [/quote<] So you are saying that Intel's demo practices for unreleased products are exactly like AMD's and Nvidia's and ARM licensees', etc.? Well of course they are. I seem to remember that about this time last year AMD had a very similar canned demo for its "ultrathin" version of Trinity running, and not one person ginned up conspiracy theories about how AMD can't do graphics. Interestingly enough, one full year later AMD's success with "ultrathin" parts is so lousy that TR has never reviewed a single product that includes a low-TDP Trinity part in it. I don't remember there being a huge outcry over demos from anybody else, but all of a sudden Intel is cheating? Get over it: all demos are designed to show a product in the best possible light, and that's fine as long as you have an honest context for what the demo is about and honest reviews of the real products when they launch.

        • Sahrin
        • 7 years ago

        [quote<]So you are saying that Intel's demo practices for unreleased are exactly like AMD's and Nvidia's and ARM licensee's etc. etc.? [/quote<] This isn't about what Intel's demo practices say about Intel, it's about what Intel's demo practices say about Haswell. If AMD or nVidia did this (which they have in the past, and will in the future) it is usually an indicator that the product sucks. The headline crows about how Haswell's IGP is going to set a new bar for Intel IGPs...but then the facts (or lack thereof) begin to materialize.

      • swiffer
      • 7 years ago

      [quote<]Intel's gaming achilles heal has always been drivers.[/quote<]Indeed. However late AMD has been for optimizing their drivers for and fixing bugs in new non-Gaming Evolved titles, expect Intel to be [i<]at least[/i<] twice as slow. Several Inspiron Dell laptops issued to friends of mine using the now ancient HD 3000 integrated GPU with the latest official Dell drivers had severe graphics anomalies when running Guild Wars 2. To fix the corruption issue one had to update to the latest Intel drivers. Unfortunately, the latest Intel drivers didn't play well with the inverter used to drive the backlight in these Dell laptops and the backlight burned itself out in a day.

    • NeelyCam
    • 7 years ago

    [quote<]QuickSync has gained a partially open-source dispatcher that could allow the transcoding tech to be used in open-source projects like Handbrake.[/quote<] This would be great. Will I end up doing all my transcoding with an Ultrabook...?

    • sschaem
    • 7 years ago

    I think this link helps a little, assuming Intel and AMD don't lie.
    Note that AMD produced a 3DMark result, so their 40% win comes from that.
    For Intel, we just have to take their word for it.

    [url<]http://www.kitguru.net/components/graphic-cards/jules/amd-richland-a10-6800k-vs-haswell-gt3-graphics-performance-analysed/[/url<] In short, the 6800K will have a GPU that is potentially 50% faster than the GT3. And AMD will release a 28nm refresh (the 6800K is a 32nm part) at the end of the year (be nice, I heard that!), so AMD could have a huge lead in APU graphics by the end of the year. Also, every single next-gen console will be running AMD silicon (even the Steam box seems to be AMD-powered)... this might greatly affect titles running on the PC released in 2014.

      • willmore
      • 7 years ago

      [quote<]. this might affect greatly titles running on the PC released in 2014.[/quote<] Interesting observation. Good for AMD/ATI customers.

    • derFunkenstein
    • 7 years ago

    It is worth noting that a 650M can come with a 128-bit-wide path to either GDDR5 or DDR3. If this is the DDR3 variant, performance is closer to a 640M (or a slower 640 desktop part). If it's GDDR5, performance is much better, obviously, with more than 2x the bandwidth.

    edit: source – [url<]http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html[/url<] edit 2: this is not to belittle the improvements in Haswell's iGPU, it's just worth noting that there could be a big difference between different 650M solutions.
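
    The bandwidth gap derFunkenstein describes is easy to work out: bandwidth is just bus width times effective transfer rate. A quick sketch, using representative memory clocks for the two 650M variants (the specific MT/s figures are assumptions for illustration; actual clocks vary by laptop vendor):

    ```python
    def bandwidth_gb_s(bus_bits, effective_mt_s):
        """Memory bandwidth in GB/s: bus width (bits) x effective rate (MT/s)."""
        return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

    ddr3  = bandwidth_gb_s(128, 1800)  # DDR3 at 1800 MT/s (assumed)
    gddr5 = bandwidth_gb_s(128, 4000)  # GDDR5 at 4000 MT/s (assumed)
    print(round(ddr3, 1), round(gddr5, 1))  # → 28.8 64.0
    ```

    With those assumed clocks the GDDR5 variant ends up with a bit more than twice the bandwidth of the DDR3 one, consistent with the comment above.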

      • tviceman
      • 7 years ago

      Also, if vsync was enabled, the GT 650M could have been artificially limited to 30 fps when it could have been getting 45-50 fps. Intel would not let Anand see frame rates, which makes me believe they had vsync enabled on both test machines.

        • Meadows
        • 7 years ago

        Oooh, shady shady.

        • NeelyCam
        • 7 years ago

        Clearly the most obvious explanation is the right one: since they looked so similar, they were both GT650M, and Intel just said that one is Haswell. Because Intel always lies.

          • Sahrin
          • 7 years ago

          The next competent graphics solution Intel releases will be the first.

          • phileasfogg
          • 7 years ago

          <blockquote>”Clearly the most obvious explanation is the right one:”</blockquote>

          Maybe it’s time for you to change your name to NeelyOccam’sRazor 😉

          • gmskking
          • 7 years ago

          All big corporations lie. It's the American way.

        • NeelyCam
        • 7 years ago

        From Anandtech:

        [quote<]"The video below shows Dirt 3 running at 1080p on both systems, with identical detail settings (High Quality presets, no AA, [b<]vsync off[/b<])."[/quote<] [url<]http://www.anandtech.com/show/6600/intel-haswell-gt3e-gpu-performance-compared-to-nvidias-geforce-gt-650m[/url<] Also: [quote<]"Haswell GT3 [b<]with embedded DRAM[/b<] (the fastest Haswell GPU configuration that Intel will ship)"[/quote<] I guess Charlie was right.. [url<]http://www.semiaccurate.com/2012/04/02/haswells-gpu-prowess-is-due-to-crystalwell[/url<]

          • chuckula
          • 7 years ago

          Bear in mind that Charlie made every possible prediction for Haswell’s embedded DRAM from it being 64 MB to being over 1 GB and then even went out of his way to say that the embedded DRAM chips may be cancelled… his trick to being “right” is that he conveniently censors all the articles where he makes the wrong prediction and then trumpets the one that is closest to being true….

      • My Johnson
      • 7 years ago

      Yeah, Nvidia is not clear and allows Intel a small exploit in comparisons.

      • gmskking
      • 7 years ago

      Integrated graphics will always be lacking in comparison and insufficient to run games.

    • Deanjo
    • 7 years ago

    [quote<]QuickSync has gained a partially open-source dispatcher that could allow the transcoding tech to be used in open-source projects like Handbrake.[/quote<] I wouldn't hold my breath on that. The Handbrake devs are dead set against implementing hardware-specific features, and we're still waiting for a public release of the OpenCL version (which they're losing motivation to release, as it seems they have a hard time matching the speeds of a mid-range i5, or so they say).

      • chuckula
      • 7 years ago

      How will they respond to AVX2? I remember that when the original AVX came out (floating-point only) some of the x264 guys weren’t interested because they only wanted integer SIMD instructions. Now with AVX2 there is integer support for 256 bit SIMD registers and AMD will get an implementation out in the future as well. Do the handbrake guys allow for specialized instructions in a general CPU vs. specialized hardware execution units?
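
      For context, the integer support chuckula mentions means AVX2 can operate on eight 32-bit integers per instruction. A minimal sketch using the `_mm256_add_epi32` intrinsic, with a scalar fallback for CPUs without AVX2 (the function names here are our own, not from any encoder's codebase):

      ```c
      #include <immintrin.h>
      #include <stdio.h>

      /* Add two arrays of 8 ints in one 256-bit AVX2 operation. */
      __attribute__((target("avx2")))
      static void add8_avx2(const int *a, const int *b, int *out) {
          __m256i va = _mm256_loadu_si256((const __m256i *)a);
          __m256i vb = _mm256_loadu_si256((const __m256i *)b);
          _mm256_storeu_si256((__m256i *)out, _mm256_add_epi32(va, vb));
      }

      /* Scalar fallback: one add per element. */
      static void add8_scalar(const int *a, const int *b, int *out) {
          for (int i = 0; i < 8; i++)
              out[i] = a[i] + b[i];
      }

      int main(void) {
          int a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
          int b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
          int out[8];
          /* Dispatch at runtime based on CPU capability. */
          if (__builtin_cpu_supports("avx2"))
              add8_avx2(a, b, out);
          else
              add8_scalar(a, b, out);
          for (int i = 0; i < 8; i++)
              printf("%d ", out[i]);  /* prints the eight element-wise sums */
          printf("\n");
          return 0;
      }
      ```

      Video encoders spend most of their time in exactly this kind of integer arithmetic (motion search, SAD, transforms), which is why doubling the integer SIMD width matters to the x264 folks in a way the original floating-point-only AVX did not.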

      • Flying Fox
      • 7 years ago

      If the performance is there, someone may fork it to support the feature.

        • willmore
        • 7 years ago

        I don't know if it's 'performance' that keeps the Handbrake guys away. They're sort of fond of using the highest-quality H.264 encoder out there. That's not a statement of their policy, but rather an impression I've gotten from reading their forums.

      • NeelyCam
      • 7 years ago

      [url<]http://www.avsforum.com/t/1358343/intel-quicksync-decoder-hw-accelerated-ffdshow-decoder-with-video-processing[/url<]

        • Deanjo
        • 7 years ago

        and? How does that have anything to do with Handbrake and Quicksync?

          • NeelyCam
          • 7 years ago

          Open-source project LIKE handbrake + quicksync.

      • swiffer
      • 7 years ago

      There are public Handbrake OpenCL nightlies available now. They don’t include lookahead support (which is where most of the performance gains came from) but it’s a start.

    • jdaven
    • 7 years ago

    If the GT3 silicon can keep up with the 650M, then AMD will need an IGP at least as fast as the desktop Radeon 7750 for a comfortable, competitive lead. I wonder if AMD can fit 512 SPs in an APU and keep power under 100W.

      • chuckula
      • 7 years ago

      In a desktop power envelope I think AMD is still safe from Intel for at least 2 reasons: 1. AMD's architecture is better when there's enough juice available, like on the desktop, and 2. Intel's highest-end IGPs aren't going to be available on the desktop. Trinity on the desktop won't lose to Haswell, although Haswell will narrow the gap. The Richland update to Trinity will maintain a comfortable lead for AMD on the desktop at the ~100 watt TDP level.

      In mobile parts, however, we already see that Trinity is better than Ivy Bridge, but not insanely better. As the power envelope drops, the Intel IGPs tend to scale somewhat better because they are frankly designed for mobile and are only inefficiently overclocked to get somewhat better desktop performance. Haswell will move ahead of Trinity at 35 watts and below, but Richland will either be about as fast or a little faster than Haswell in the mobile space if AMD plays its cards right.

        • yogibbear
        • 7 years ago

        Oh I forgot about that… I thought for sure one of the lower desktop parts was going to have a GT3, e.g. the equiv haswell of an i3… but I went back and looked at the table and you’re right. 🙁 I guess it means the gaming performance of ultrabooks could be pretty decent if you can get one of the perf haswell mobile chips into the thermal spec.

        • Dposcorp
        • 7 years ago

        Why won't Intel's highest-end IGPs be available on the desktop?
        Intel's current fastest IGP is the HD 4000, and that is available in desktop chips. Or do you mean that they run at lower speeds?

        In regards to the HD 4000 currently available:

        “The GPU clock will also directly impact the performance, of course. Differing CPU models will have HD 4000 GPUs running at different maximum and minimum speeds, a common occurrence with Sandy Bridge notebooks and HD 3000 GPUs. For example, notebooks sporting a Core i7-3610QM will have their integrated HD 4000 GPUs operating 150MHz slower than the same GPU found in the i7-3720QM or i7-3820QM.”

        Taken from this Article.
        [url<]http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html[/url<]

          • chuckula
          • 7 years ago

          [quote<]Why wont Intel's highest-end IGPs be available on the desktop?[/quote<] You'll have to ask Intel. It's certainly possible that it could happen in the future, but all the leaks about Haswell model numbers indicate that the GT3 parts are not showing up on desktops. There are still some clock-speed variations between the same GPU model, but the GT3 parts include 40 execution units, and the number of available units is by far the largest factor in determining performance. The GT2 parts on desktops all include 20 EUs, which is a step up from the 16 in Ivy Bridge, but it ain't going to beat Trinity.

          • NeelyCam
          • 7 years ago

          [quote<]Why wont Intel's highest-end IGPs be available on the desktop?[/quote<] If I had to guess, I'd say that Intel's market research folks determined that those who go with desktop systems probably have discrete GPUs anyway, and cutting the GPU in half means a smaller (and cheaper) chip. There is still that "price-sensitive" casual gaming desktop market that AMD is catering to with Trinity/Richland, but maybe Intel's market research concluded that it's not worth making expensive chips for this potentially small market when those chips wouldn't be that competitive anyway.

            • willmore
            • 7 years ago

            I wonder how many different die are going to be in that family (Haswell). Will there be a GT3 die and a GT2 die? If that’s the case, it would make sense to tune the GT3 die for lower power consumption and the GT2 for higher performance.

            With that idea in mind, I'm going to go take another look at the chart of wattages for GT2 vs. GT3 notebook chips. If my theory is correct, I'd expect to see the GT2 chips suffer in terms of power efficiency.

            Then again, they could use some die harvested GT3 die for some of the GT2 products–like the ultra low power ones.

            So, without cracking the procs open and taking an SEM to the dies, we may never know.

            • jonjonjon
            • 7 years ago

            i’m guessing that intel doesn’t see it needed as much on the desktop. most people that are gaming on desktop would still buy a better discrete video card. sure it would be nice and some people would use it but on an ultrabook choices are really limited.

            my question is: would adding GT3 to a desktop cpu raise the price or TDP, and would it cause any performance loss? i know intel raised the TDP on the haswell desktop cpu's, which is strange because it seems like intel has been all about trying to lower TDP. so they might not be comfortable with the TDP if they used GT3.

            • NeelyCam
            • 7 years ago

            [quote<]i know intel raised the TDP on the haswell desktop cpu's which is strange because it seems like intel has been all about trying to lower TDP.[/quote<] Some folks here suggested that was because of the integrated voltage regulators

          • MadManOriginal
          • 7 years ago

          Probably to manage TDP on the desktop. Haswell TDP is going to range over a factor of 10 (10W – ~100W) and that’s not easy to do with one process. In order to keep desktop CPU performance up, the GPU had to be neutered a bit. Plus, the chips that had HD4000 IGPs initially were K series and i7 which are the ones that made the least sense. The only thing it’s useful for in those is Quicksync even though that is a great use for the IGP in a desktop. It would be great if Intel came out with lower-end desktop CPUs with GT3 earlier rather than later though.

      • My Johnson
      • 7 years ago

      Yes, we have no idea of the thermal envelope the Intel solution runs in.

    • cartman_hs
    • 7 years ago

    if we got it for “free”, why not? keep up the good work intel!

      • sschaem
      • 7 years ago

      It's not free at all. They had to remove 2 CPU cores to budget the space for the GT2 on the desktop parts.

      What would you buy for $350 for your desktop?
      A 4 core haswell with a GT2
      or a 6 core haswell with no GPU

        • yogibbear
        • 7 years ago

        I think we don’t get those cores back until AMD starts to be competitive at the top end of the performance desktop chips again.

        • OneArmedScissor
        • 7 years ago

        For $350 you could be buying the rebranded Xeon of the day, which comes close to a desktop exclusive design.

        And yet, your “removed” cores don’t show up there, either, and the chip is [b<]even larger without the GPU[/b<]. You also have no idea what will represent the $350 tier for the Ivy Bridge and Haswell version. It may very well have 6 cores. ...but it absolutely will not be the same size and cost as the GPU version. They mass produce those for laptops and they're not going to suddenly switch gears and drive lower volume, high end desktop parts down.

        • cartman_hs
        • 7 years ago

        i was looking from the mobile perspective….if desktop, i guess it wouldn’t really matter as my current external graphic is still powerful enough for all my gaming…

    • yogibbear
    • 7 years ago

    Plop one of these in my PT13 silverstone case. Looking good 🙂

      • albundy
      • 7 years ago

      oh hell yeah! that would be a pretty freakin sweet htpc!

    • chuckula
    • 7 years ago

    Ah snap.. and I just posted about this in the Shortbread.

    Yeah, it’s not scientific. No, the Haswell IGP isn’t as fast as the 650M. But… this is a big step up for Intel and Haswell is going to do very well at giving a good performance/power consumption balance in mobile devices. This is where the IGP really matters, and Intel has done a pretty good job with Haswell by all accounts.
