Chris Hook wrote:It’s alive!
K-L-Waster wrote:https://www.techradar.com/news/intel-will-take-on-nvidia-and-amd-graphics-cards-by-mid-2020-but-gamers-could-be-disappointed

The rumors of a mid-2020 launch for the Intel Xe graphics card come from industry sources, according to DigiTimes. Apparently, the sources have said that: “Instead of purely targeting the gaming market, Intel is set to combine the new GPUs with its CPUs to create a competitive platform in a bid to pursue business opportunities from datacenter, AI and machine learning applications.”
That means that Intel’s upcoming graphics card might not be aimed at gamers. The focus on AI and machine learning applications in particular could worry Nvidia, which has been investing heavily in those areas.
The Egg wrote:I just had a dumb thought: Have GPU makers ever considered including the ability to power their cards off an external brick power supply?
The Egg wrote:K-L-Waster wrote:That means that Intel’s upcoming graphics card might not be aimed at gamers. The focus on AI and machine learning applications in particular could worry Nvidia, which has been investing heavily in those areas.
If true, that's unfortunate. I'd like to see a lot more competition and better offerings in the sub-$150 range. Even a hopped-up integrated GPU could potentially compete with a 1030 or RX 550 if given its own dedicated GDDR. Above that, the 570 is a solid performer for the money, but it's too hot and power hungry for what should be its target market (people trying to make a cheap gaming rig out of store-bought PCs with barely adequate PSUs).
K-L-Waster wrote:The Egg wrote:I just had a dumb thought: Have GPU makers ever considered including the ability to power their cards off an external brick power supply?
Not sure it would help in a desktop -- who wants to be fishing a lead from a power brick back into their case?
Concupiscence wrote:Yeah, this is a downer if it's true. I just got the parts to build a server I plan to put to part-time work with OpenCL, and was hoping I could grab an Intel GPU for under $200 to round things out.
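For anyone else in the same boat: once a card is in the box, a quick sanity check is to enumerate what the OpenCL runtime can actually see. Here's a rough sketch using only the standard OpenCL 1.2 host API (nothing Intel-specific, and error handling is mostly skipped for brevity):

Code:
// Minimal OpenCL device enumeration: lists every platform and the
// GPU devices it exposes. Build with: g++ list_gpus.cpp -lOpenCL
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char pname[256] = {};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(pname), pname, nullptr);
        std::printf("Platform: %s\n", pname);

        cl_uint num_devices = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;  // no GPU devices on this platform
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char dname[256] = {};
            cl_ulong mem = 0;
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(dname), dname, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_GLOBAL_MEM_SIZE, sizeof(mem), &mem, nullptr);
            std::printf("  GPU: %s (%llu MiB)\n", dname,
                        (unsigned long long)(mem >> 20));
        }
    }
    return 0;
}

If a sub-$200 Intel card ever does ship, it should show up under Intel's platform entry the same way any other vendor's card does.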
JustAnEngineer wrote:https://videocardz.com/newz/intels-first-xe-graphics-processor-is-called-ponte-vecchio

Intel Ponte Vecchio is not a gaming GPU. The first Xe graphics are for exascale computing. On November 17th, Intel will share details on the “Aurora” project. This exascale computer features Sapphire Rapids Xeon CPUs, Ponte Vecchio GPUs, and Intel’s new initiative called oneAPI (a unified programming model).
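For anyone wondering what "oneAPI (a unified programming model)" actually means in practice: the core of it is DPC++, Intel's implementation of SYCL, where one C++ kernel can run on whatever device the queue binds to -- CPU, iGPU, or a discrete part. A rough sketch in plain SYCL 2020 (my own illustration, not anything from the article):

Code:
// Minimal SYCL vector add: the same kernel runs on whichever device
// the default selector picks (CPU, iGPU, or discrete GPU).
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    constexpr size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q;  // default selector: picks the "best" available device
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    {   // Buffers hand ownership of the host arrays to the runtime.
        sycl::buffer<float> ba(a.data(), sycl::range<1>(n));
        sycl::buffer<float> bb(b.data(), sycl::range<1>(n));
        sycl::buffer<float> bc(c.data(), sycl::range<1>(n));

        q.submit([&](sycl::handler& h) {
            sycl::accessor ra(ba, h, sycl::read_only);
            sycl::accessor rb(bb, h, sycl::read_only);
            sycl::accessor wc(bc, h, sycl::write_only);
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                wc[i] = ra[i] + rb[i];
            });
        });
    }   // Buffer destruction copies the result back into c.

    std::cout << "c[0] = " << c[0] << "\n";  // expect 3
    return 0;
}

Compiled with any SYCL 2020 toolchain (Intel's DPC++ compiler included), the same binary targets whatever device is installed, which is presumably the whole pitch for Aurora.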
JustAnEngineer wrote:What boggles the mind is that he seems to be getting new supercomputers from all of the suppliers at once.
Waco wrote::lol: I have essentially zero interest in Aurora/A21. It'll be a "neat" machine, but my groups tend to design/buy machines that can run our mission codes at the best possible efficiency. Machines like El Capitan, Summit, and Frontier will run at about 0.5% efficiency at best for the codes we run.
Igor_Kavinski wrote:Intel could have an unfair advantage in that it could optimize its drivers to make its CPUs and GPUs work in synergy, using AVX instructions and other tricks that aren't well known. They could even run the iGPU and dGPU in tandem in a sort of SLI scheme to eke out as much speed as possible. Their secret sauce could be adding special instructions to future CPU architectures just to accelerate their GPUs.
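To be fair, the CPU side of that wouldn't even need to be secret sauce: detecting CPU features at runtime and dispatching to a wider SIMD path is bog-standard driver practice. A toy sketch with GCC/Clang builtins (the sum functions here are made up purely for illustration):

Code:
// Toy runtime dispatch: take the AVX2 code path when the CPU has it,
// fall back to scalar otherwise. Compiles with GCC or Clang.
#include <immintrin.h>
#include <cstdio>

// AVX2 path: sum 8 floats per iteration.
__attribute__((target("avx2")))
float sum_avx2(const float* x, int n) {
    __m256 acc = _mm256_setzero_ps();
    int i = 0;
    for (; i + 8 <= n; i += 8)
        acc = _mm256_add_ps(acc, _mm256_loadu_ps(x + i));
    float tmp[8];
    _mm256_storeu_ps(tmp, acc);
    float s = tmp[0] + tmp[1] + tmp[2] + tmp[3]
            + tmp[4] + tmp[5] + tmp[6] + tmp[7];
    for (; i < n; ++i) s += x[i];  // scalar tail
    return s;
}

float sum_scalar(const float* x, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += x[i];
    return s;
}

float sum(const float* x, int n) {
    // __builtin_cpu_supports queries CPUID under the hood.
    if (__builtin_cpu_supports("avx2"))
        return sum_avx2(x, n);
    return sum_scalar(x, n);
}

int main() {
    float data[16];
    for (int i = 0; i < 16; ++i) data[i] = 1.0f;
    std::printf("sum = %f\n", sum(data, 16));
    return 0;
}

A driver tuned per CPU generation along these lines is plausible; special CPU instructions that only Intel's own GPU driver uses would be the more speculative part.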
Igor_Kavinski wrote:https://www.anandtech.com/show/15130/anandtech-exclusive-an-interview-with-intels-raja-koduri-about-xe
What's funny is how uninformative the interview is regarding what we really want to know: how competitive their discrete GPU solution is. There are no hints as to when it might hit the streets. Intel focusing on HPC first reeks of similarities to the Larrabee fiasco. Maybe their GPU is not yet ready for primetime, and probably won't be for quite a while, considering both nVidia and AMD (with its new line-up next year) have moved the goalposts to raytracing as a standard feature. It was already going to be hard for Intel to get a competitive GPU made WITHOUT raytracing. It might either be underwhelming, or Intel might spend half a decade playing catch-up. There's also that line about impacting the 200-million-unit installed base of iGPUs, so they might raise the bar for the iGPU, which they had been doing before Raja anyway.
Aranarth wrote:Competition is ALWAYS a good thing!
Tiger Lake is "One of the first products with our Xe graphics architecture"
Intel will be manufacturing its Xe HPC-class GPUs on its latest 7nm process node. This is also the lead 7nm product that Intel has talked about previously. Intel will make full use of its new and enhanced packaging technologies, such as Foveros and EMIB interconnects, to develop the next exascale GPUs. Just in terms of process optimizations, the following are a few of the key improvements that Intel has announced for its 7nm process node over 10nm:
2x density scaling vs 10nm
Planned intra-node optimizations
4x reduction in design rules
EUV
Next-Gen Foveros & EMIB Packaging
JustAnEngineer wrote:https://hothardware.com/news/intel-massive-xe-hp-baap-of-all-gpu-prototype-enthusiasts
I'd bet on that behemoth being an HPC chip, not a practical gaming GPU.
The DG1 is rumored to reflect Intel's Tiger Lake specifications, with 96 Execution Units (EUs), which works out to 768 cores. It was also expected that the card might have 3GB onboard, pointing to the (very) entry-level nature of the product.
The DG1 dGPU will be identical to the Tiger Lake iGPU in all regards (with the major exception of the allowed TDP, of course). The entire purpose of DG1 is to serve as a set of training wheels of sorts, preparing the rest of the ecosystem for the arrival of Intel's serious products further down the line.