Intel teases discrete graphics card on new Twitter account

Intel's plans to release a discrete graphics product in 2020 are well-known, but just what that product will look like is not at all known. We may have a very slightly better idea today thanks to the newly-inaugurated Intel Graphics Twitter account. The company tweeted a teaser video reminding PC users that its graphics products power a huge swath of screens on the planet, and it closes with the reminder that “in 2020, we will set our graphics free.”

The teaser video shows us what appears to be a single-slot card of some kind, though the largely featureless and likely-rendered image doesn't offer much more to go on than that. Still, Intel says of its 2020 plan: “that's just the beginning.” For now, the next year and four months (or more) can't pass quickly enough.

Comments closed
    • Leader952
    • 1 year ago

    Intel teases Nvidia releases.

    https://techreport.com/news/34008/nvidia-announces-geforce-rtx-2080-ti-rtx-2080-and-rtx-2070

    • moose17145
    • 1 year ago

    1. I will believe it when it actually happens

    2. I hope they support FreeSync/AdaptiveSync and it finally forces NVidia to also support the same. That way I could actually use that feature of my monitor with more than just AMD.

      • chuckula
      • 1 year ago

      I hope they support G-sync because I want the usual suspects to be even madder than they already are.

      • cygnus1
      • 1 year ago

      If Nvidia has no problem selling plenty of $1200 RTX GPUs with G-Sync and no FreeSync, don’t count on anything Intel does changing the gaming monitor landscape.

    • WaltC
    • 1 year ago

    “Awwww, gee…we don’t have anything now, but just wait for two more years to see what’s coming! We don’t even know ourselves–but man are we hyped…;)”

    Impressive? [not.] Sorry Intel. Come back next time with a product, and we’ll talk…;) (It’s fun, though. Long time since I’ve seen vaporware like this, I have to admit…! Intel must be sweating bullets…:))

    • Chrispy_
    • 1 year ago

    I very much doubt I’ll care about the first wave of Intel dGPUs, given the bottom-of-the-barrel API support and limited/buggy software interfaces of their IGPs, but at the same time I’m hoping they can compete with AMD and Nvidia.

    Honestly, the PC GPU market suffered greatly with the loss of 3DFX and PowerVR; then even Matrox stopped bothering and just wrote multi-display software for AMD cards rather than design its own hardware.

    It doesn’t matter which company you prefer or support – more competition speeds up the pace of progress as it lowers prices. The only reason to want it any other way is if you’re a cash-cow shareholder that only cares about the share price and doesn’t actually need the products involved.

      • sconesy
      • 1 year ago

      True about competition. As we’ve seen in both processors and GPUs if one company bungles a generation the whole market stagnates and consumers get taken to the cleaners.

      • NovusBogus
      • 1 year ago

      Also worth noting that past behavior suggests that Intel will go after GPGPU applications. And I seem to recall much wailing and gnashing of teeth on sites like this when someone found a real world GPGPU application and the two existing manufacturers didn’t know what to do about it.

    • Sahrin
    • 1 year ago

    Great. Just what we needed in the GPU market, another quasi-monopolist that abuses its partners and treats gamers like shit.

      • chuckula
      • 1 year ago

      I see you are extremely upset about what AMD’s monopoly on game consoles has done to the market.

      I can’t disagree considering how AMD’s monopoly vice-grip on both Xbox and Playstation has resulted in severe harm to PC games via consolitis.

      Clearly as a non-koolaid drinking rational person you are extremely eager for Intel’s competition to breathe new life into the market and you are hoping for Intel to pursue the console market so that consumers finally have a real choice.

      AMIRIGHT?!??!

        • Sahrin
        • 1 year ago

        Lol, since when has Intel breathed new life into anything? Their key ‘innovation’ over the last 20 years was the P6 core.

        From 40 years ago.

          • chuckula
          • 1 year ago

          [quote]Lol, since when has Intel breathed new life into anything?[/quote]

          Well, let’s see here. How much of a server market do you think there would be for AMD to capture if servers didn’t happen to be dominated by the same x86 architecture that AMD happens to have a license for?

          As for Intel never innovating anything, I thought you’d have a little more respect for the Core 2 Quad, which was the first commercially successful MCM product.

          But I find the litany of allegedly “pro-AMD” posts from you that actually sound a whole lot more like abject crap-your-pants fear being masked with a combination of bigotry and inferiority-complex noise to be amusing. You sound like the fossilized remains of 1970s-era Soviet propaganda.

          The funny thing is that for all your surface-level swagger, you betray the fact that you have literally zero confidence in your precious AMD’s ability to compete whatsoever. That post you just made where you once again insult Intel as worthless of course raises the question: if Intel is so stupid, then why is AMD still playing second fiddle to them after all these decades? What does that make AMD?

          Of course, you have the easy “OMG INTEL IS EVIL BWHA!” defense mechanism that absolves you of any responsibility to produce a rational argument, since poor, helpless little AMD has just been trapped for the last 40 years with insanely superior technology that literally nobody wants -- not even trillion-dollar Apple, in a buyout that wouldn’t even amount to a rounding error on its bottom line. But because “big evil Intel” exists -- you know, that company that’s tiny compared to Apple -- not to mention “big evil Nvidia” -- you know, that company that used to be the small fry compared to ATi until ATi gave up -- you don’t need to supply any rational argument.

          You just have to exude politically correct emotional responses and hurl personal insults while actually being far more insulting to AMD than any alleged Intel or Nvidia fanboy I’ve ever seen around here.

            • cygnus1
            • 1 year ago

            Just want to clarify: the entire world’s PC, laptop, and server markets are dominated by the x86-64 architecture, to which Intel has a license and which it did not develop itself. The Intel x86 architecture dominates nothing these days.

      • TheRazorsEdge
      • 1 year ago

      It will be additional competition, which is usually the only thing that prevents consumers from getting dicked over.

      The other thing that stops companies from screwing consumers is government regulation. Since competition increases choice and decreases prices, I’ll take that any day.

      And BTW, it takes billions of dollars to design and ship a modern GPU, and you need extensive IC design experience if you want it to work. Intel is realistically the only company that can decide to push into the high-performance GPU market.

        • cygnus1
        • 1 year ago

        There are vanishingly few government regulations that actually increase competition. Most regulations, while they may benefit society at large, believe it or not, generally reduce competition and create barriers to entry. Many times the cost to businesses of complying prevents new entrants from joining whatever the regulated industry/market is.

        • cygnus1
        • 1 year ago

        And also, I think ARM or another mobile GPU company (maybe even Apple) could also push into the midrange GPU market without much issue. And since that’s where the majority of the profit is, I think that’s where Intel (and any other new entrant) would focus. If they could put out something with close to GTX 1070 performance for say $200 or less, they’d eat nvidia alive.

      • K-L-Waster
      • 1 year ago

      Adding a new competitor increases monopolization? That’s… unexpected…

        • Redocbew
        • 1 year ago

        You’ve heard of CEO math. Behold Jackass math. It’s something different, but curiously similar.

        • jarder
        • 1 year ago

        Yet that’s exactly what happened in the server market. There used to be lots of different server CPU architectures (MIPS, SPARC, POWER, PA-RISC etc.) and system vendors.
        When Intel finally started making low-cost x64 server chips using commodity hardware they were able to undercut all of their competitors on price (whilst being in the same ballpark in performance) and slowly put them all out of business. The prices of course did not stay low for long once the competition had been dealt with.

        Not that I think Intel will be able to pull that masterstroke again (and masterstroke might be a bit strong, as they nearly scuppered themselves with that Itanium thing). I see this mainly as a play to keep Nvidia in their place; they have been getting increasingly powerful these past few years and have been aggressive in their pursuit of new markets like AI. Intel, then, has the ability to go all “contra-revenue” on their GPUs and do some serious damage to Nvidia’s profitability. That could be just enough leverage to keep Nvidia in their place and not threaten that all-important status quo.

        Then again, I could be overthinking this…

    • ronch
    • 1 year ago

    Well, for a company that’s been putting graphics inside their CPUs for so long, this is bound to happen. It’s the next logical step.

      • Leader952
      • 1 year ago

      It only took them, what, 15 years to figure that out.

      • Anonymous Coward
      • 1 year ago

      I would have thought the [i]first[/i] logical step would be to get their integrated GPUs and drivers up to AMD’s standard. But apparently they want to skip investing money in a market they already sit on top of and would rather go directly after new money. Never mind the aggressive company who is sitting happily on top of [i]that[/i] market.

    • Unknown-Error
    • 1 year ago

    That was fast.

      • ronch
      • 1 year ago

      That’s what she said!

        • Unknown-Error
        • 1 year ago

        OUCH and ROTFLMAO at the same time after reading that.

        • tipoo
        • 1 year ago

        My condolences

    • DPete27
    • 1 year ago

    Wow, that only leaves them 2 years or less to get their reputation for driver updates on par with the competition.
    Better get to it, Intel. The clock is ticking.

    • DavidC1
    • 1 year ago

    On the driver side, the hope is that Nvidia will give them such stiff competition that they’ll find even the lofty aspirations they have for Gen 12 aren’t enough.

    That will wake them up and make them consider building a more well-balanced product, drivers included. Assuming they don’t ditch it like other projects.

    Considering what they do with monopoly share, it’s good for the market that Intel does not succeed too easily.

      • Anonymous Coward
      • 1 year ago

      I foresee Intel vs. Nvidia going like it went with Intel and Microsoft on phones. It’s probably AMD that will feel the heat.

    • Kretschmer
    • 1 year ago

    That’s cool; I’ll check it out in 2025 to see if the drivers hold up.

    • Phaleron
    • 1 year ago

    Anyone else remembering what happened with Vega and saying to themselves:

    “you know that’s cool and all but I’ve been on one Raja graphics architecture hype train already. I’m not ready for a second one, show me the independent benchmarks.”

      • Redocbew
      • 1 year ago

      That’s usually a good position no matter who makes it, but this is the Internet and all.

      • chuckula
      • 1 year ago

      Now that Raja is not at AMD and Intel is actually competing, it’s refreshing to see the same AMD fanboys who claim Vega is FINEWINE better than Pascal also claim that Vega is a failure and Raja is an idiot.

        • Antimatter
        • 1 year ago

        There’s an old saying in Tennessee. I know it’s in Texas, probably in Tennessee that says, Fool me once, shame on … shame on you. Fool me… You can’t get fooled again! – George Bush

        • Spunjji
        • 1 year ago

        Are they actually the very same fanboys, or are you generalizing again? 😉

        • Leader952
        • 1 year ago

        [quote]and Intel is actually competing[/quote]

        Cool the jets, Mr. Hype. All Intel has released is a video of a “single-slot card of some kind, though the largely featureless and likely-rendered image”. They have not even posted the obligatory PowerPoint slides. Yet here you are saying that they will be competing.

        It seems that someone has been dipping into the Intel Kool-Aid too much and has their rose-colored glasses on, because your brain seems to not be working correctly. Maybe you are having sunstroke from being in the sun too long.

          • chuckula
          • 1 year ago

          The buttthurt is strong in you.
          But thanks for proving my point.
          Incidentally, nowhere did I ever say that Intel’s GPUs would destroy the competition.

          But given your level of anger at having the potential for more choice in GPUs, I’m sure you’d like to state loudly and proudly that AMD didn’t “compete” in the CPU market from 2011 until 2017 either. Oh wait, oops… I logically applied the b.s. double standard you like in a way that’s not favorable to your biases. Sorry.

          In fact… let’s make that a standard for your hypocrisy. Let’s assume Intel’s Arctic Sound is twice the size (or more) of the competition, sucks down substantially more power, and doesn’t really win on performance… just like Bulldozer.

          You’ll either say that both parts were not “competition” consistently or claim that both parts are competition consistently.

            • Leader952
            • 1 year ago

            The point being that you are hyping a vaporware, non-existent GPU that may or may not come two years from now.

            Oh, and that you are really buttthurt. Thanks for proving that point.

            • chuckula
            • 1 year ago

            [quote]The point being that you are hyping a vaporware non-existent gpu that may or may not come two years from now.[/quote]

            Good, then you’ll post a link to every single post you ever made attacking Zen as vaporware starting in [b]2013[/b] when it first got name-dropped. Hell, Zen 2 still hasn’t been demoed yet, and according to you it’s practically on sale.

            Didn’t see your posts attacking Lisa Su as an outright liar earlier this summer when she held up a heat spreader that could literally have had peanut butter underneath it and you couldn’t have known the difference.

    • not@home
    • 1 year ago

    Before I buy an Intel GPU, I want to see how well Intel supports it, driver-wise. If they support it for only two or three years I will never buy any Intel GPU. If they support it for 10+ years, and make sure games run smoothly on it, then I will be tempted. I currently have a 7870 (from 2012 and still supported) and a 3870 (from 2007 and not supported anymore). In a year or two when I get around to upgrading, I will be looking at whoever has the best long term support.

      • chuckula
      • 1 year ago

      Considering Intel intentionally gimped i740 support in DX12 to force people to upgrade to Arctic Sound, I’m not confident.

        • Klimax
        • 1 year ago

        I wonder how many readers got the joke.

          • chuckula
          • 1 year ago

          Probably not enough.

          One thing I’ve found is that the first victim of fanboyism is a sense of humor, which ends up in the same shallow grave as logic and the ability to formulate an objective standard that is applied to all parties.

      • Chrispy_
      • 1 year ago

      Only today, I had to tell someone that their laptop would need to be replaced or downgraded to Windows 7 because Intel don’t make Windows 10 graphics drivers for their four-year-old laptop and the basic compatibility drivers weren’t enough.

      Meanwhile, Nvidia still go back 8 years, despite recently culling support for a whole bunch of legacy cards, and AMD go back to 2009 at least, maybe further for specific cards.

        • psuedonymous
        • 1 year ago

        [quote]Meanwhile, Nvidia still go back 8 years, despite recently culling support for a whole bunch of legacy cards, and AMD go back to 2009 at least, maybe further for specific cards.[/quote]

        Nvidia dropped support for Fermi (making Kepler the oldest supported architecture this year), and AMD dropped support for everything prior to the RX 2xx series back in 2015.

          • barich
          • 1 year ago

          Well, that’s 2012 for Kepler and 2011 for GCN 1.0. Still a lot better than what you get out of Intel.

          Even when you do get new drivers from them, often they’re just bug/security fixes and not updates to a new WDDM version or with improved support for recent game releases or anything. Only Skylake (from late 2015) and up are getting the level of support that Kepler and GCN 1.0 get.

            • tfp
            • 1 year ago

            Yeah, I’m super upset Intel hasn’t updated its embedded graphics drivers for the latest game releases.

            • DancinJack
            • 1 year ago

            lol

          • Chrispy_
          • 1 year ago

          Whilst it’s true that you cannot get the latest W10 drivers for the Fermi or Terascale architectures, the last posted versions still work, which means that until Microsoft stops supporting WDDM 1.1, Windows 10 will continue to run with a full driver (rather than the limited compatibility driver) on these old models.

          My issue with the HD 3000 in this laptop was that fairly basic 3D hardware functionality was missing in the compatibility driver, rendering a 2014 laptop (purchased new) unfit for Windows 10 – an OS that came out only one year later. Intel never supported these, and so the most popular graphics architecture (by far) of the last decade is rendered useless by – what in many cases was an unwanted and unexpected – upgrade to Windows 10.

            • K-L-Waster
            • 1 year ago

            ^^ This.

            It’s one thing to say the available driver is long in the tooth, it’s another thing altogether to say you can’t have one at all. AMD and Nvidia may no longer be updating the drivers for older cards, but at least they have existing drivers that work.

        • Klimax
        • 1 year ago

        Which IGP is that? Because even WDDM 1.x drivers work well under 10, and all Intel IGPs since Core 2 are fully supported under 10. (Exceptions are those few Atoms paired with an IT IGP.)

        4 years ago, that would be either Sandy Bridge or Ivy Bridge. I’ve got 4 PCs with these and they are running Windows 10 well.

        And if OEM blocked “unverified” IGP drivers you can override it by manual install.

          • Rurouni
          • 1 year ago

          I have what I believe is a Sandy Bridge Pentium (G6xx) with Intel® HD Graphics for 2nd Generation Intel® Processors. I tried installing Windows 10, which was fine until Windows updated the GPU driver, at which point the PC stopped displaying anything! So there I was with a blank screen, unable to do anything, since apparently Windows 10 disables safe mode by default or something like that? Luckily it is a PC, not a laptop, so I happened to have a discrete GPU (a GTX 4xx) that I could use, and I got my display back. The thing is that initially I didn’t want to use a discrete GPU (I was trying to get that thing using as little power as possible), so I tried the generic driver… nope, shitty performance. So I ended up just using that GTX 4xx in that PC.

          The Pentium G6xx is from 2011, and the GTX 4xx is from 2010…

          Actually, you can search the net and find similar problems from people with the same iGPU, and of course the majority come from laptop users, who, if they moved to Windows 10, now have to content themselves with shitty performance on the generic driver or go back to Windows 7. Windows 10 came out 4 years after the Pentium G6xx, and Intel decided not to support it.

          I checked Intel’s site for their GPU drivers, and indeed Sandy Bridge and older are not supported on Windows 10. I’m not sure if all Sandy Bridge variants will experience a blank screen when updated to the latest driver provided by Windows, or just some very specific Sandy Bridge GPUs that are not compatible, but either way, it was not a good experience for me.

            • Klimax
            • 1 year ago

            You could try https://downloadcenter.intel.com/download/24971/Intel-HD-Graphics-Driver-for-Windows-7-8-64-bit?product=97498

            I can’t test it, as I have only newer Intel HD parts (2 Atoms under the Celeron brand). Also, there might be a firmware bug and an update from the motherboard maker. (I saw quite some interesting fixes in this area – the best were the changelogs for Intel’s own mainboards; one could learn quite a bit from them.)

          • Chrispy_
          • 1 year ago

          Core i3-2348M, Toshiba Satellite from early 2014; HD3000

          Just because Ivy Bridge had launched by that point doesn’t mean that brand-new laptops weren’t still selling with Sandy. Hell, you can still buy brand-new Skylake laptops from most major retailers, and that’s [i]technically[/i] a 2015 product.

          This is why GPU vendors dropping support early is bad: people are likely already losing 1-2 years of support from laptop manufacturers taking their sweet time to update their product lines, and then from retailers taking their time to update their stocks. Not only that, but brand-new tech usually carries a price premium compared to the outgoing generation, which is usually discounted by anything from about 15 to 40%. With the stagnation from Intel in the CPU market over the last decade, it’s not surprising that people are choosing the heavily discounted prior generation when buying a new device.

            • Klimax
            • 1 year ago

            1) Why do you think I have written:
            [quote<]4 years that would be either Sandy Bridge or Ivy Bridge.[/quote<] I explicitly took into account inertia. (Been there myself) W7/8 driver should still work properly. (Although it is possible for OEM to botch things to such degree that only OEM driver will work - I'd try checking Toshiba's download section too) I'd try [url<]https://downloadcenter.intel.com/download/24971/Intel-HD-Graphics-Driver-for-Windows-7-8-64-bit?product=97498[/url<] (Despite claiming W7/W8 support it seems to be same version as the one for W8.1) BTW: Support run for HD3000/HD2000 was three years. Just checked one of PCs with i3-2105 and it is using WU-supplied driver and there were so far no issues through out. (But it is desktop PC so I bypassed some of potential idiocies that can be caused by OEMs)

            • Klimax
            • 1 year ago

            Interestingly, the IGP in Ivy Bridge had a support run of 5 years. (The last update was released 3.1.2018.)

            • Chrispy_
            • 1 year ago

            I’m not disagreeing with you, simply clarifying the situation with more details.

            Sadly, OEM shenanigans are likely to blame, and yet that covers a large percentage of all laptops. I tried forcing the W7/8 driver and it wouldn’t display an image; it booted to a black screen, and I’m pretty sure that’s just an unsupported WDDM version.

            The CAD applications that refused to run were OpenGL, and as a quick test I copied GLQuake over, and that also failed to run. My assumption (I honestly don’t know) is that the in-the-box driver that W10 ships with covers only the lowest common denominators.

            I was actually quite surprised that I couldn’t get Sandy Bridge working because I’ve had Core2 desktops running W10 on much older hardware – the thing is that they tend to use Nvidia/AMD GPUs because Intel IGPs were so awful back then that nobody touched them.

            • Klimax
            • 1 year ago

            I’d try entering VGA mode and force-installing the last Intel driver. (Unpack the exe and use Device Manager for a manual install – I have bypassed OEM shenanigans this way.) That should always work, because the Intel driver is blocked by the installer, not by the INF file.

            • K-L-Waster
            • 1 year ago

            The W7 / W8 driver may still work, but W10’s more restrictive driver model may prohibit you from installing them. (I have a Rosewill wireless network adapter that supports 5GHz in hardware but will only allow 2.4GHz under W10 because W10’s default driver doesn’t support the alternate radio and it will not allow me to install the Rosewill W7 driver *that used to work just fine at 5GHz* since it isn’t signed for W10.)

            • Klimax
            • 1 year ago

            All restrictions are done at the UEFI/device-ID level.
            UEFI can block “unsupported” devices, and/or OEMs can give a device their own private device ID that is unknown to or blocked by the official first-party driver installer (either the installer itself or extra configuration lines in the INF file).

            Windows 10 blocks only known-broken drivers (the only time I saw that block was with StarForce DRM). The only other “block” is on ancient VGA devices that have only XDDM drivers. (Windows 8 dropped XDDM altogether.)

        • Ninjitsu
        • 1 year ago

        I’m pretty sure Intel supports Haswell-era GPUs on Windows 10… since I have such a setup.

          • cygnus1
          • 1 year ago

          You’re assuming the person bought a laptop with the latest hardware 4 years ago. I constantly have to explain to people that “yeah, your phone/laptop/tablet/whatever is only a couple years old, but the components aren’t supported any more because they’re actually 2 or 3 times older than the device they shipped in.” That crap happens all the time. Or they bought a no-name or even just non-flagship Android something, and it’s over a year old, so they’re screwed too.

        • cygnus1
        • 1 year ago

        Aww, I’m sure they could run Windows 8, lol, no??

    • BooTs
    • 1 year ago

    That’s no PCIE connector. Must be AGP 2.0.

      • Redocbew
      • 1 year ago

      It’s a trap!

      • Klimax
      • 1 year ago

      AGP Pro

      • Wirko
      • 1 year ago

      That’s no connector. Must be a retractable wireless PCIe 4.0 x16 antenna.

      • BIF
      • 1 year ago

      Intel had its own discrete GPU as an AGP part about 18-19 years ago…what was it called, Starfighter or something? It wasn’t great, but it wasn’t bad, either.

      But then Intel stopped developing it. And that’s why I won’t buy another Intel GPU. Gotta show me you MEAN it. It must support every standard out there now and it must support CUDA. And then development has to keep going. Until then, it’s Nvidia for me.

    • Bumper
    • 1 year ago

    Not much to see, but I am excited. I’ll probably have upgraded to Nvidia’s 2000-level leap before then (assuming there is not a better AMD alternative), but having an Intel discrete graphics card just sounds so cool.

      • Spunjji
      • 1 year ago

      Tragically, that assumption looks increasingly sound.

    • RtFusion
    • 1 year ago

    Finally,

    I can truly build an RGB setup, Red (Radeon), Green (nVidia), and Blue (Intel).

      • Wirko
      • 1 year ago

      One more possible setup is Ryzen + GeForce + Brianless SSD

    • Shobai
    • 1 year ago

    [quote]graphics is in our core[/quote]

    But perhaps we’ll disable it for certain 10 nm designs, amirite?

      • RtFusion
      • 1 year ago

      Ayooo!

    • Shobai
    • 1 year ago

    Their 4k Netflix comment is a convenient glossing over, if my [admittedly flaky] memory serves.

    [edit: https://techreport.com/news/31004/netflix-4k-coming-to-a-select-few-pcs]

    • chuckula
    • 1 year ago

    If discrete video cards are clearly going to be dead by 2020, then these cards must be UNDEAD.

    Looks like we now know the codename of Navi’s successor: Chainsaw.

      • derFunkenstein
      • 1 year ago

      Chainsaw and Boomstick (https://www.youtube.com/watch?v=zdkqagOUaPM).

        • jihadjoe
        • 1 year ago

        Those are not very discreet names, that’s for sure!

      • Krogoth
      • 1 year ago

      Nah, they aren’t going to be dead. They are going to be turning into a niche within the next decade.

        • derFunkenstein
        • 1 year ago

        If only wild prognostication was dying.

          • Krogoth
          • 1 year ago

          It isn’t wild prognostication, though. It is the consequence of the miniaturization that has been at the heart of digital computing since the beginning.

          Almost everything in a modern desktop/laptop used to be on separate discrete cards/modules. On a modern system, almost everything is on the motherboard, and there’s barely any justification for the five to seven physical slots that the ATX form factor calls for. There’s nothing special about discrete GPUs that makes them immune to the miniaturization trend.

          It is only a matter of time before iGPUs and semi-integrated solutions start to offer performance similar to mid-tier discrete SKUs with far less hassle. The low-end and basic 2D/3D discrete GPU market is pretty much dead already due to the ubiquity of iGPUs.

            • derFunkenstein
            • 1 year ago

            So you’re…saying it’s not wild prognostication and then wildly prognosticating to back it up.

            • chuckula
            • 1 year ago

            The only people who get into more trouble over speculation than Krogoth are Intel’s CPU designers!

            • derFunkenstein
            • 1 year ago

            At least the CPU designers get executed (speculatively)

            • Redocbew
            • 1 year ago

            It appears there is something special about NICs though. The integrated networking that you find in every motherboard now could have been brought into the CPU years ago, but still nobody does it unless you’re using an SBC like the Raspberry Pi. USB controllers must be special as well, but that goes without saying. The naming convention that’s been adopted there makes it obvious that USB is [i<]very[/i<] special.

            • windwalker
            • 1 year ago

            [quote=”Krogoth”<]It is the consequence of the miniaturization that has been heart of digital computing since the beginning. Almost everything in a modern desktop/laptop used to be on separate discrete cards/modules. On a modern system, almost everything is on the motherboards and there's barely any justification for having five to seven physical slots that the ATX form factor calls for.[/quote<] All of that is true but it illustrates only the tendency towards integration. For integration to make sense the part that is a candidate for integration must be fairly stagnant or evolving at a significantly slower pace than the part it will be integrated into. For CPUs and GPUs it's obvious that the opposite is true. [quote="Krogoth"<]There's nothing special about discrete GPUs that makes them immune to the miniaturization trend. [...] Low-end and basic 2D/3D discrete GPU market is pretty much dead due to the ubiquity of iGPUs.[/quote<] GPUs are not against miniaturisation; but they are not good candidates for integration. The ultra-low-end part of the market that could be integrated already has been. The rest of the market demands more than current iGPUs provide. Just the simple fact that a good GPU has double the TDP of a good CPU (which includes an iGPU) is an indication that should be clear even for someone who doesn't know anything about the subject. To use some older terms, one can say that the market for video graphics adapters is gone, but the market for 3D graphics accelerators is not going away any time soon. [quote="Krogoth"<]It is only a matter of time before iGPUs and semi-integrated solutions start to offer similar levels of performance to mid-tier discrete SKUs with far less hassle.[/quote<] It is only a matter of time before the heat death of the Universe. I wasn't familiar with the current level of performance of iGPUs so I decided to check. 
To minimize bias, I settled on the iGPU model, the games, and the expected level of graphics quality and performance before looking anything up. For the iGPU, I knew that whatever Apple chose for its just-released 13" MacBook Pro must be the top performer. The games that came to mind as very popular with moderate requirements are Fortnite, DotA 2, and World of Warcraft. I chose 1080p at 60 fps on the high graphics preset as the target for good-enough quality and performance. The benchmark [url=https://www.notebookcheck.net/Intel-Iris-Plus-Graphics-655-GPU-Benchmarks-and-Specs.316632.0.html<]results[/url<] I found show fairly decent performance, but still very far from that target.

        • Spunjji
        • 1 year ago

        Only if gaming does too, otherwise they will *always* offer something more than an integrated system can.

          • Krogoth
          • 1 year ago

          Actually, they just need to deliver “good enough” performance for the masses with less hassle, and iGPUs and semi-integrated solutions are almost there. Once that happens, demand destruction in the mid-range discrete market is inevitable.

          We have seen this happen to other discrete components again and again. The most recent victims were discrete audio cards and peripheral controllers; they exist now only as niches for those who need something beyond what integrated solutions offer. The masses are perfectly content with what comes with their platform.

            • windwalker
            • 1 year ago

            Conceptually, what you say is correct.
            But I don’t see any clear indication that “iGPUs are almost there”.
            It is much harder to predict when something will happen than to predict if something will happen.

      • Redocbew
      • 1 year ago

      I bet Raja is already working on his cardio. It’s one of the rules, after all.

    • gerryg
    • 1 year ago

    I can’t tell if that’s an industrial A/C underneath, or a Mr. Fusion to power it.

      • Khali
      • 1 year ago

      My money is on an industrial flat-pack version of a Fusion Core from General Atomics International.

    • Growler
    • 1 year ago

    Teaser images means Intel can’t be that discreet.

      • geniekid
      • 1 year ago

      You disgust me. Have your upvotes.

      • chuckula
      • 1 year ago

      Bender! [url=https://img.buzzfeed.com/buzzfeed-static/static/2013-10/enhanced/webdr06/16/5/enhanced-buzz-13089-1381916487-3.jpg<]It's you![/url<]

    • barich
    • 1 year ago

    I will possibly be interested in whatever discrete graphics card they’re making 5 years after they release their first one (I will generously count the i740 as a mulligan) and have proven that they can actually provide reasonable driver support.

    • Concupiscence
    • 1 year ago

    Given that the design’s called Arctic Sound and that looks like a turbine cooler, can I get away with calling it a snow blower?

      • chuckula
      • 1 year ago

      More like snow melter.

        • Concupiscence
        • 1 year ago

        I am really curious to see what they pull off here. If the prices are reasonable on a midrange part and the Linux support is there on day one, I may pull the trigger for my workstation.

          • chuckula
          • 1 year ago

          I’m not expecting any miracles because while we all like to talk about Raja joining Intel, this product isn’t some out-of-the-blue GPU that just started as soon as Raja showed up at Intel.

          The rumors are that Arctic Sound was already partway through development when Raja came over, and the original plan was that these cards would never be pushed into the consumer market. Instead, they were meant for server clusters running various GPGPU tasks, with a heavy emphasis on accelerated video streaming.

          Raja apparently managed to convince management that this product could be a consumer GPU as well. He may very well have influenced some design changes, but this GPU is not some made-out-of-thin-air product that was originally designed for playing games.

          I tend to believe the rumors because there’s no way that Raja could show up at Intel in late 2017 and have a made-from-scratch GPU out on the market in only 3 years anyway.

            • DavidC1
            • 1 year ago

            They were planning on doing that as early as Purley, judging by early documents, with the graphics core based on Cannonlake. The latter has been pushed out, but we'll definitely see action on the server side.

    • chuckula
    • 1 year ago

    Revenge of the Raja!

    Bonus points if they troll Nvidia by reminding everybody that Larrabee did ray tracing too.

    Double bonus points if they paint it red to counterpoint AMD’s pro-cards being painted blue.

      • tipoo
      • 1 year ago

      I feel like there has been a tendency online to blame Raja for Vega, with assumptions that he was shown the door, but this story, if true, puts things in a different light: Raja was frustrated that two-thirds of his Vega resources were pulled onto Navi for the PS5. An interesting twist.

      [url<]https://www.pcgamesn.com/amd-sony-ps5-navi-affected-vega[/url<]
