Intel to go deep on Xe GPU info at GDC

Intel is getting ready to make a big splash with its Xe graphics cards, but so far the company has mostly spoken in broad terms about how the architecture will scale from exascale supercomputing down to integrated mobile chips and everything in between. Now, though, Intel is ready to talk about Xe in detail at GDC 2020 this March.

Antoine Cohade, a Senior Developer Relations Engineer at Intel, will give a presentation called “Primer on Intel Graphics Xe Architecture” at the Game Developers Conference.

Here’s the official description of the presentation from the GDC 2020 Session Scheduler, spotted by Tom’s Hardware:

Intel’s brand new Xe Architecture has been teased for a while, and is scheduled for release later this year! This update brings significant compute, geometry, and throughput improvements over today’s widely used Gen9 and Gen11 graphics.

This talk will provide a detailed tour of the hardware architecture behind Intel’s upcoming GPUs – unveiling the structure behind its building blocks and their performance implications. Special consideration will be taken to explain how graphics engineers can best exploit the new Xe Architecture. We will then take an in-depth look at the powerful new features being introduced with this new architecture.

This show is for developers


Intel’s developer-only Xe DG1 discrete graphics card

GDC is, as the name suggests, a show for developers. Even so, we’ll likely be able to glean some interesting details from it. Cohade has also responded to questions on Twitter, promising that the presentation will cover how Xe differs from previous generations of Intel graphics, the different optimizations for discrete and integrated chips, and how Xe supports ray tracing.

It’s hard to imagine Intel making a huge splash with discrete Xe cards immediately. Integrated Xe graphics, though, will cover a huge range of laptops at a variety of power levels. That kind of market penetration could make optimizing for Xe a necessary part of game development. If Intel is aggressive about pricing and marketing, it could light a fire under Nvidia, too.
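As a purely illustrative sketch of what that might look like in practice (none of this comes from Intel’s talk): the first step in any vendor-specific tuning is simply detecting the hardware. The Vulkan snippet below enumerates GPUs and checks for Intel’s well-known PCI vendor ID, 0x8086, and notes whether each part is integrated or discrete so an engine could branch accordingly; the Xe-specific branching itself is left as a placeholder comment.

```c
/*
 * Hypothetical sketch: detect Intel GPUs via Vulkan and note whether each
 * is integrated or discrete. Standard Vulkan 1.x API calls only; the
 * "optimize for Xe" decision itself is just a placeholder comment.
 */
#include <stdio.h>
#include <vulkan/vulkan.h>

#define INTEL_PCI_VENDOR_ID 0x8086u /* Intel's PCI vendor ID */

int main(void) {
    VkApplicationInfo app = {0};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ci = {0};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "failed to create Vulkan instance\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice devices[16];
    if (count > 16) count = 16; /* cap for this simple example */
    vkEnumeratePhysicalDevices(instance, &count, devices);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devices[i], &props);
        if (props.vendorID != INTEL_PCI_VENDOR_ID)
            continue;

        const char *kind = "other";
        if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU)
            kind = "discrete";
        else if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU)
            kind = "integrated";

        /* A real engine might pick different shader variants or batching
         * strategies here based on the device type. */
        printf("Intel GPU found: %s (%s)\n", props.deviceName, kind);
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Again, this is just scaffolding; the Xe-specific optimizations themselves are exactly what Cohade’s talk promises to detail.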

We’re looking forward to seeing what Intel has to say about Xe at GDC; maybe we’ll even get a better idea of what Xe means for gaming in the process.

Comments
willmore

I’m curious how they’ll fab them. Will they further overload 14nm+++ or will this be the first volume product on 10nm(+?)? Maybe they’ll farm it out to TSMC or Samsung.

If they’re at 14nm, then volumes will be low. If they’re at 10nm, that could be a game changer, as it would imply they’ve finally gotten 10nm working. If they farm it out, well, that has higher-order implications for their long-term business.

Does anyone know?

chuckula

1. It’s clearly 10nm.
2. As for all of Intel’s supposed “failures” to produce 10nm parts… I just saw a $750 Dell notebook sitting out at my local Costco* using the 1065G7. Here’s a hint: If your 10nm product is in Costco, it’s widely available.

* And no, this isn’t a Costco in San Jose, it’s a generic one in the midwest.

Dave

Xe IGPs as part of Tiger Lake chips will clearly be 10nm. Supposedly the discrete GPU version will be Intel’s first 7nm part.

chuckula

The discrete version will not be Intel’s first 7nm part, unless you think Intel is already shipping 7nm today: these GPUs are going out to developers as we speak.

Intel has said that its first 7nm commercial product will be a GPU. That statement is 100% accurate, and it in no way supports the inference that the first GPU Intel ships will be a 7nm part. It’s called logic.

Krogoth

10nm has been working for a while. The problem is that silicon produced on it cannot reach the clock speeds Intel wants for its normal desktop and HEDT markets, and yields are still too poor for server-grade parts.

Krogoth

Here comes Raja’s real child, not the stillborn Vega.

Xolore

I’m pretty sure the DG1 hardware was well along in development before Raja got there, and honestly the same could be said for Vega. I don’t think I’d call either of them Raja’s children. Navi and the upcoming “Big Navi,” on the other hand…

chuckula

The DG1 low-power variants are clearly not designed to be an RTX 2080 Ti replacement.

But they are clearly going to be strong competitors in the mobile space, especially since Tiger Lake + DG1 will give you a very strong graphics package in a mobile form factor, pushing past what has been possible with IGP + GPU combinations in the past. That should worry Nvidia, given how big the mobile market is.

Krogoth

It is the endgame for budget-minded and mid-range discrete GPUs. They just need to be good enough for the masses.

Anonymous Coward

Yeah, it will be interesting to see Intel do multi-die CPU-GPU designs; it seems like it would be hard for them not to meet with a certain amount of success there.
