Tirk wrote:So I'm reading GPU articles and, without fail, even when the article isn't about Nvidia hardware, there are people in the comments saying how the holy grail will come with the advent of Pascal.
Tirk wrote:Have we seen massive performance improvements with Intel's constant die shrinks?
jihadjoe wrote:Tirk wrote:Have we seen massive performance improvements with Intel's constant die shrinks?
On the server side, yes actually.
Haswell-EP has way more cores than Sandy Bridge-EP ever had, and those small IPC improvements add up and get multiplied by the number of cores.
I understand that the desktop hasn't seen much performance benefit from Intel's process advantage, but that's because desktop computing is inherently not very parallelizable. Graphics, on the other hand, more closely resembles a server or HPC workload, and more transistors are always welcome.
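To put rough numbers on that argument, here's a back-of-the-envelope sketch. The core counts are the actual top-end SKUs (8-core E5-2690 for Sandy Bridge-EP, 18-core E5-2699 v3 for Haswell-EP); the relative IPC and clock figures are illustrative assumptions, not measurements:

```python
# Crude aggregate throughput proxy: cores x relative IPC x clock.
# Core counts match the real top SKUs; the IPC uplift (~15% over
# two generations) and clocks are assumed, illustrative values.

def throughput(cores, ipc, clock_ghz):
    """Rough aggregate instructions-per-second proxy (arbitrary units)."""
    return cores * ipc * clock_ghz

sandy_ep = throughput(cores=8, ipc=1.00, clock_ghz=2.9)     # E5-2690
haswell_ep = throughput(cores=18, ipc=1.15, clock_ghz=2.3)  # E5-2699 v3

print(f"Sandy Bridge-EP: {sandy_ep:.1f}")
print(f"Haswell-EP:      {haswell_ep:.1f}")
print(f"Speedup: {haswell_ep / sandy_ep:.2f}x")  # roughly 2x aggregate
```

Even with a lower clock per core, the extra cores times the modest IPC gain roughly doubles aggregate throughput, which is exactly the server-side win described above.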
Tirk wrote:On the power note, a smaller node will indeed usually allow a GPU to consume less power. But would the higher density of the node (heat is harder to dissipate because the same power comes from a smaller area) and tighter power limits also hinder the GPU in reaching the performance necessary to make a viable chip? After all, many have mentioned that Samsung's 14LPE and TSMC's 20nm are not suitable for high-powered chips like PC GPUs.
I am wary of these sub-28nm nodes because of how ill-suited the current TSMC 20nm and Samsung 14nm LPE processes are for large GPU designs. Are we setting ourselves up for another node disappointment?
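The density worry above is easy to see with simple arithmetic. All the numbers here are hypothetical, chosen only to illustrate the effect, not taken from any real GPU:

```python
# Illustrative power-density arithmetic: shrink the same design to a
# denser node and the same heat comes out of a smaller area.
# The 250 W / 600 mm^2 figures are made-up examples, not real parts.

def power_density(watts, area_mm2):
    """Watts dissipated per square millimetre of die."""
    return watts / area_mm2

big_28nm = power_density(watts=250, area_mm2=600)  # hypothetical 28nm GPU
big_16nm = power_density(watts=250, area_mm2=300)  # same design, ~2x denser

print(f"28nm: {big_28nm:.2f} W/mm^2")
print(f"16nm: {big_16nm:.2f} W/mm^2")  # twice the heat per mm^2
```

Unless the shrink also cuts power proportionally, the cooler has to pull twice the heat out of each square millimetre, which is one reason a mobile-tuned process can struggle with big, hot GPU dies.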
Tirk wrote:A new architecture is hard enough; but also using a 16nm node that's intrinsically a different design beast with FinFETs, and executing HBM, which by itself requires a completely different design shift from GDDR5, leads me to highly doubt their predictions for Pascal. I would think that it will either be heavily delayed, fail to implement one of these advancements correctly, or be a highly expensive halo product out of most people's budget. Which, btw, if Pascal is only a halo product, will that mean most of Nvidia's GPUs next year are re-brands and still have to use GDDR5? Am I naive not to believe a tech company can implement all of that in one generation flawlessly? Do any of you have examples where a trifecta of major advancements was implemented correctly and on time in a single generation of product?
Airmantharp wrote:I gotta wonder why Samsung isn't interested in fabbing these chips for AMD and Nvidia. I'm sure there's at least one very good reason, like the process not being suited for large, high-power designs (they mostly make mobile stuff), but Samsung seems like the only company that's actually giving Intel a run for their money in the die-shrink race.
Chrispy_ wrote:Maybe I'm way too cynical, but this debate looks far too much like a turfing thread from someone with 10 total posts, 7 of which are in his own turfing thread. Even if that wasn't the intention, it certainly looks like bait from my sofa.
Fiji's not even out. It looks like it's going to be about the same as a 980Ti, which is disappointing for competition reasons, but priced appropriately it will keep Nvidia in check. The 28nm process is the limiting factor here: AMD have gone with supercharging the memory technology to push the GCN architecture into competition, whilst Nvidia dialled back and cut out yet more GPGPU capabilities to selectively tweak Maxwell for current game engines.
It's two sides of the same 28nm coin: Maxwell sacrifices capabilities to optimise for the specific balance of GPU resources common in today's games, while Fiji's GCN cores divide resources and capabilities more evenly but need HBM steroid injections to keep up in those (many) instances where a game is just better suited to Maxwell.
I'm waiting for Fiji reviews, more specifically reviews of air-cooled Fiji products. I'm expecting any GCN product to run hotter and hungrier than the competing Maxwell product, but AMD have two cards to play, as always:
- AMD will price their cards to be competitive based on their performance, regardless of cost to produce or the originally intended market position. If Fiji was supposed to be a Titan X killer and it's only a 980Ti, then they'll undercut the 980Ti.
- If you plan on owning an AMD card, history has proven that AMD cards age better. The 7970 and GTX680 were a close match at launch, but constant updates to GCN drivers make the 7970 a venerable performer today. Nvidia abandon their architectures like a fickle child once they have a new one, so even AAA games like Shadow of Mordor are a mess on older Nvidia cards. The same is true for The Witcher 3, if the forum whining is to be believed.